Apr 24 21:13:31.542890 ip-10-0-134-248 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:13:31.542903 ip-10-0-134-248 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:13:31.542910 ip-10-0-134-248 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:13:31.543118 ip-10-0-134-248 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:13:41.772061 ip-10-0-134-248 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:13:41.772084 ip-10-0-134-248 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 447fb130141f4ecbb19c285639f216e3 --
Apr 24 21:16:15.898046 ip-10-0-134-248 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:16.392673 ip-10-0-134-248 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:16.392673 ip-10-0-134-248 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:16.392673 ip-10-0-134-248 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:16.392673 ip-10-0-134-248 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:16.392673 ip-10-0-134-248 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:16.395269 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.395193 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:16.401848 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401829 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:16.401848 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401846 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:16.401848 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401849 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:16.401848 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401852 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:16.401848 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401855 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401858 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401860 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401863 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401865 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401868 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401870 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401873 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401876 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401878 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401881 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401883 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401886 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401888 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401890 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401893 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401895 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401898 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401900 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401902 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:16.402022 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401905 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401907 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401910 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401912 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401918 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401921 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401923 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401927 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401929 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401932 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401935 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401937 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401940 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401943 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401946 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401948 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401951 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401953 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401957 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401962 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:16.402485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401964 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401967 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401969 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401971 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401974 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401976 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401980 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401983 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401985 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401988 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401990 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401993 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401995 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.401998 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402001 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402003 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402005 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402008 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402010 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:16.402974 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402013 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402015 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402018 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402021 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402023 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402026 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402029 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402031 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402034 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402036 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402039 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402042 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402044 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402048 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402052 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402056 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402059 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402062 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402064 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:16.403401 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402067 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402069 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402072 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402076 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402459 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402465 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402467 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402470 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402472 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402475 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402478 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402480 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402483 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402485 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402488 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402490 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402493 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402495 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402498 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402501 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:16.403853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402504 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402506 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402510 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402513 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402516 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402519 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402521 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402523 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402526 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402529 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402531 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402535 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402539 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402542 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402545 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402549 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402552 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402555 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402558 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402561 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:16.404312 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402563 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402566 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402568 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402571 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402575 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402579 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402582 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402586 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402588 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402591 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402593 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402596 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402598 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402601 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402603 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402605 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402608 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402610 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402612 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:16.404809 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402615 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402617 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402620 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402623 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402625 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402627 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402630 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402632 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402635 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402638 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402642 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402644 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402646 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402648 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402651 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402653 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402655 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402658 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402660 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402662 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:16.405250 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402665 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402667 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402670 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402672 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402675 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402677 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402679 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402681 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402684 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402686 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.402688 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403518 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403527 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403534 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403538 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403544 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403547 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403552 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403560 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403563 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403566 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:16.405719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403570 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403573 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403576 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403579 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403582 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403585 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403587 2578 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403590 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403593 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403597 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403599 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403602 2578 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403605 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403608 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403612 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403615 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403618 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403621 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403624 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403627 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403630 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403633 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403636 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403644 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403647 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:16.406253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403650 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403652 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403656 2578 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403659 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403665 2578 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403668 2578 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403670 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403673 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403677 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403681 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403683 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403686 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403689 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403692 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:16:16.406838 ip-10-0-134-248
kubenswrapper[2578]: I0424 21:16:16.403695 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403698 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403700 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403703 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403706 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403708 2578 flags.go:64] FLAG: --feature-gates="" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403712 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403715 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403718 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403721 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403724 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:16:16.406838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403727 2578 flags.go:64] FLAG: --help="false" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403729 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-134-248.ec2.internal" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403732 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403735 2578 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403738 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403741 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403757 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403760 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403763 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403766 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403769 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403773 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403776 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403779 2578 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403781 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403784 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403787 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:16:16.407418 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:16:16.403790 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403793 2578 flags.go:64] FLAG: --lock-file="" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403795 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403798 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403801 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403806 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:16:16.407418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403809 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403812 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403814 2578 flags.go:64] FLAG: --logging-format="text" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403817 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403820 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403823 2578 flags.go:64] FLAG: --manifest-url="" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403825 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403829 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403832 2578 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403836 2578 flags.go:64] FLAG: --max-pods="110" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403839 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403842 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403845 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403847 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403850 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403853 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403856 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403862 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403865 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403868 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403872 2578 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403875 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:16:16.408009 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:16:16.403881 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403883 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:16:16.408009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403887 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403890 2578 flags.go:64] FLAG: --port="10250" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403893 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403895 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05bd01d5bfa0a9ec1" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403898 2578 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403901 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403904 2578 flags.go:64] FLAG: --register-node="true" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403906 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403909 2578 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403912 2578 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403915 2578 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403917 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403920 2578 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:16.403923 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403926 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403929 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403931 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403934 2578 flags.go:64] FLAG: --runonce="false" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403937 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403940 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403943 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403945 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403948 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403951 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403954 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403957 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:16:16.408567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403959 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403962 2578 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403964 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403968 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403971 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403974 2578 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403976 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403981 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403984 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403987 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403990 2578 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403993 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403995 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.403998 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.404001 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.404004 2578 flags.go:64] FLAG: --v="2" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:16.404008 2578 flags.go:64] FLAG: --version="false" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.404011 2578 flags.go:64] FLAG: --vmodule="" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.404015 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.404018 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404105 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404109 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404111 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404115 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:16.409276 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404117 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404119 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404122 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404124 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404127 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404129 2578 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404132 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404134 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404136 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404140 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404144 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404147 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404150 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404152 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404155 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404157 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404159 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404162 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 
21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404164 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404166 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:16.409853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404169 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404171 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404174 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404176 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404178 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404180 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404183 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404185 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404187 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404190 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404192 2578 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404197 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404200 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404202 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404204 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404206 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404209 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404211 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404213 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404215 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:16.410352 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404218 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404220 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404223 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404225 2578 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404227 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404230 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404232 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404235 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404237 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404239 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404242 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404244 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404247 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404250 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404253 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404255 2578 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404258 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404262 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:16:16.410860 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404265 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404268 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404271 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404273 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404275 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404278 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404280 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404282 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404284 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404287 2578 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404289 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404292 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404294 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404296 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404299 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404301 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404305 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404307 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404310 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404312 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:16.411297 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404318 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404320 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404323 2578 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.404325 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.404961 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.411294 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.411307 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411355 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411361 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411364 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411367 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411369 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411372 2578 feature_gate.go:328] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411375 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411377 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:16.411797 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411381 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411383 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411386 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411388 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411391 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411393 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411396 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411398 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411401 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411403 2578 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411405 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411408 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411410 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411412 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411415 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411417 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411419 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411422 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411424 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:16.412157 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411426 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411429 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411431 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:16.412623 ip-10-0-134-248 
kubenswrapper[2578]: W0424 21:16:16.411434 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411438 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411440 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411443 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411445 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411447 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411451 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411455 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411457 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411460 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411462 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411465 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411467 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411470 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411473 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411477 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:16.412623 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411479 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411482 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411485 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411488 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411490 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411493 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411495 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411498 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411500 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411502 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411504 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411507 2578 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411509 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411511 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411513 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411516 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411518 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411521 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411524 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411527 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:16.413116 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411529 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411532 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411534 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411536 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411539 
2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411541 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411543 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411546 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411548 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411550 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411553 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411555 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411558 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411560 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411562 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411564 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411566 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: 
W0424 21:16:16.411569 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411571 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:16.413595 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411573 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.411578 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411679 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411684 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411687 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411689 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411692 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411695 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 
21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411698 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411700 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411703 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411705 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411708 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411711 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411713 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:16.414058 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411715 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411718 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411720 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411722 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411725 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411727 2578 feature_gate.go:328] 
unrecognized feature gate: InsightsConfig Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411729 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411731 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411734 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411736 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411738 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411741 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411758 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411761 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411764 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411766 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411768 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411771 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: 
W0424 21:16:16.411774 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411776 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:16.414403 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411778 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411781 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411783 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411785 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411788 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411790 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411793 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411796 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411800 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411803 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411807 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411809 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411812 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411815 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411817 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411819 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411822 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411824 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411827 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:16.414891 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411829 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411831 2578 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411834 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411836 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411839 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411842 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411844 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411846 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411849 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411851 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411854 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411856 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411858 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411861 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 
21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411863 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411865 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411868 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411870 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411872 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411875 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:16.415371 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411877 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411880 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411882 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411884 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411887 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411890 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411892 2578 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAWS Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411895 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411899 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411901 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411904 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411906 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411908 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:16.411911 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.411915 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:16:16.415871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.413253 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:16:16.416214 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:16.415248 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:16:16.416268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.416257 2578 server.go:1019] "Starting client certificate rotation" Apr 24 21:16:16.416414 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.416397 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:16:16.417354 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.417344 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:16:16.447102 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.447085 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:16:16.449672 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.449653 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:16:16.463070 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.463003 2578 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:16:16.471575 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.471560 2578 log.go:25] "Validated CRI v1 image API" Apr 24 21:16:16.472865 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.472846 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:16:16.479033 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.479017 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:16:16.481281 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.481262 2578 fs.go:135] Filesystem UUIDs: map[17fb196f-52e5-42a1-a4ce-493084846f48:/dev/nvme0n1p3 
669757d4-a4da-44b8-880f-899013378f2d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 24 21:16:16.481355 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.481281 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:16:16.487170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.487064 2578 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:16.485083625 +0000 UTC m=+0.452885689 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200612 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e195f0bddfbcc97a847906a77fb36 SystemUUID:ec2e195f-0bdd-fbcc-97a8-47906a77fb36 BootID:447fb130-141f-4ecb-b19c-285639f216e3 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:4e:b2:7b:fc:33 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4e:b2:7b:fc:33 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:43:93:db:a6:ce Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:16:16.487170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.487165 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 21:16:16.487270 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.487236 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:16:16.488416 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.488393 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:16:16.488550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.488419 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-248.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:16:16.488594 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.488559 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:16:16.488594 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.488568 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:16:16.488594 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.488584 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:16.490459 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.490448 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:16.491280 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.491270 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:16.491376 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.491367 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:16:16.493849 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.493839 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:16:16.493882 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.493852 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:16:16.493882 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.493866 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:16:16.493882 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.493876 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:16:16.493990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.493885 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:16:16.495030 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.495019 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:16.495080 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.495037 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:16.498459 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.498434 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:16:16.499842 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.499829 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:16:16.501198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501183 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wqsvb"
Apr 24 21:16:16.501589 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501578 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:16:16.501620 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501607 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:16:16.501620 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501614 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:16:16.501620 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501620 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501626 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501631 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501637 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501643 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501649 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501655 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501672 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:16:16.501693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.501680 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 21:16:16.503688 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.503675 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 21:16:16.503688 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.503687 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 21:16:16.505615 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.505306 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-248.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:16:16.505681 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.505309 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:16:16.507315 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.507303 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 21:16:16.507360 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.507342 2578 server.go:1295] "Started kubelet"
Apr 24 21:16:16.507448 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.507424 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 21:16:16.507961 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.507923 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 21:16:16.507994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.507981 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 21:16:16.508214 ip-10-0-134-248 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:16:16.509343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.509327 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wqsvb"
Apr 24 21:16:16.510220 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.510204 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:16:16.510830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.510813 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:16:16.516455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.516228 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:16:16.516455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.516366 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:16.518326 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.518190 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:16:16.518412 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.518331 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:16:16.518412 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.518392 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:16.519266 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.519246 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:16:16.519352 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.519265 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:16:16.520061 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.520039 2578 factory.go:55] Registering systemd factory
Apr 24 21:16:16.520189 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.520169 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:16:16.520616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.520603 2578 factory.go:153] Registering CRI-O factory
Apr 24 21:16:16.520708 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.520698 2578 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:16:16.521103 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.521084 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:16:16.521185 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.521121 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:16:16.521185 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.521148 2578 factory.go:103] Registering Raw factory
Apr 24 21:16:16.521185 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.521171 2578 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:16:16.521315 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.521300 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:16:16.522564 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.522541 2578 manager.go:319] Starting recovery of all containers
Apr 24 21:16:16.524863 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.524843 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:16.526639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.526598 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-248.ec2.internal" not found
Apr 24 21:16:16.529168 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.529143 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-248.ec2.internal\" not found" node="ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.534967 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.534800 2578 manager.go:324] Recovery completed
Apr 24 21:16:16.539316 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.539301 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:16.542143 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.542128 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:16.542203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.542155 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:16.542203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.542185 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:16.542643 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.542631 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:16:16.542687 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.542645 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:16:16.542687 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.542663 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:16.543040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.543026 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-248.ec2.internal" not found
Apr 24 21:16:16.544984 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.544973 2578 policy_none.go:49] "None policy: Start"
Apr 24 21:16:16.545025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.544988 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:16:16.545025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.545003 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586283 2578 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.586313 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586322 2578 server.go:85] "Starting device plugin registration server"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586529 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586538 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586619 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586688 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.586696 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.587190 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:16:16.588011 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.587237 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:16.600044 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.600030 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-248.ec2.internal" not found
Apr 24 21:16:16.653741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.653700 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:16:16.655019 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.654999 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:16:16.655085 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.655023 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:16:16.655085 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.655038 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:16:16.655085 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.655044 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 21:16:16.655085 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.655074 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 21:16:16.658703 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.658689 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:16.686797 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.686776 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:16.687554 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.687541 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:16.687608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.687566 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:16.687608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.687576 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:16.687608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.687597 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.697336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.697320 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.697388 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.697341 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-248.ec2.internal\": node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:16.713882 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.713864 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:16.755502 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.755469 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"]
Apr 24 21:16:16.755569 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.755545 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:16.756216 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.756204 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:16.756258 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.756228 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:16.756258 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.756240 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:16.757411 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.757401 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:16.757553 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.757540 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.757589 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.757568 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:16.758107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.758086 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:16.758107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.758098 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:16.758222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.758115 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:16.758222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.758120 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:16.758222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.758125 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:16.758222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.758130 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:16.759148 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.759134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.759215 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.759156 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:16.759737 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.759723 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:16.759826 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.759767 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:16.759826 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.759779 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:16.784232 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.784212 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-248.ec2.internal\" not found" node="ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.790121 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.790104 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-248.ec2.internal\" not found" node="ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.814643 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.814626 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:16.915435 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:16.915393 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:16.922827 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.922812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f85132ac7426286517df0d79e1a6c22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal\" (UID: \"8f85132ac7426286517df0d79e1a6c22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.922900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.922835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c544ccdf1879496756152eb8f8b28eb-config\") pod \"kube-apiserver-proxy-ip-10-0-134-248.ec2.internal\" (UID: \"6c544ccdf1879496756152eb8f8b28eb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:16.922900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:16.922853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8f85132ac7426286517df0d79e1a6c22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal\" (UID: \"8f85132ac7426286517df0d79e1a6c22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.016171 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.016146 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:17.023614 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.023598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c544ccdf1879496756152eb8f8b28eb-config\") pod \"kube-apiserver-proxy-ip-10-0-134-248.ec2.internal\" (UID: \"6c544ccdf1879496756152eb8f8b28eb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.023665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.023643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c544ccdf1879496756152eb8f8b28eb-config\") pod \"kube-apiserver-proxy-ip-10-0-134-248.ec2.internal\" (UID: \"6c544ccdf1879496756152eb8f8b28eb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.023702 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.023686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8f85132ac7426286517df0d79e1a6c22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal\" (UID: \"8f85132ac7426286517df0d79e1a6c22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.023733 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.023706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f85132ac7426286517df0d79e1a6c22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal\" (UID: \"8f85132ac7426286517df0d79e1a6c22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.023781 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.023738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f85132ac7426286517df0d79e1a6c22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal\" (UID: \"8f85132ac7426286517df0d79e1a6c22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.023813 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.023797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8f85132ac7426286517df0d79e1a6c22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal\" (UID: \"8f85132ac7426286517df0d79e1a6c22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.086784 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.086739 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.093305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.093290 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"
Apr 24 21:16:17.116843 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.116818 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:17.217456 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.217413 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:17.317975 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.317957 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:17.369705 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.369685 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:17.416891 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.416873 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:16:17.417290 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.416988 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:17.417290 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.417036 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:17.417290 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.417043 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:17.419029 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.419013 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-248.ec2.internal\" not found"
Apr 24 21:16:17.446664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.446648 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:17.494252 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.494231 2578 apiserver.go:52] "Watching apiserver"
Apr 24 21:16:17.503577 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.503555 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:16:17.506946 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.506920 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-84bkj","openshift-multus/multus-vvqbk","openshift-multus/network-metrics-daemon-bcqjb","openshift-network-diagnostics/network-check-target-vrf9q","openshift-network-operator/iptables-alerter-8qkh9","openshift-ovn-kubernetes/ovnkube-node-49kt7","kube-system/konnectivity-agent-kt72p","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7","openshift-cluster-node-tuning-operator/tuned-5dm7j","openshift-dns/node-resolver-pvc6x","openshift-image-registry/node-ca-xtf2m"]
Apr 24 21:16:17.509176 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.509160 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vvqbk"
Apr 24 21:16:17.510277 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.510262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:17.510344 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.510329 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02"
Apr 24 21:16:17.511606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.511346 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:16:17.511606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.511367 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8qkh9"
Apr 24 21:16:17.511606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.511436 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:16:17.511606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.511467 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:16:17.511606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.511512 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:16:17.511606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.511566 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bsw77\""
Apr 24 21:16:17.512457 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.512424 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:11:16 +0000 UTC" deadline="2028-01-09 11:15:37.149153196 +0000 UTC"
Apr 24 21:16:17.512457 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.512456 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14989h59m19.636699329s"
Apr 24 21:16:17.512554 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.512522 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.512601 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.512549 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:17.513357 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.513170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:17.513633 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.513451 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:17.513633 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.513584 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:16:17.513633 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.513612 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mct2p\"" Apr 24 21:16:17.514007 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.513987 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pvc6x" Apr 24 21:16:17.515325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.515195 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.515325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.515249 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:16:17.515325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.515284 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:16:17.517343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.516071 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:16:17.517343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.516423 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:16:17.517343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.516566 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:16:17.517343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.516687 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:16:17.517343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.516943 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:16:17.517343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.517170 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r4rp4\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.517940 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:16:17.518359 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:16:17.517961 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518011 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518042 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s5q26\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.518058 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518227 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:16:17.518359 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 21:16:17.518334 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6ppx7\"" Apr 24 21:16:17.518917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518468 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal" Apr 24 21:16:17.518917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.518775 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cx7l9\"" Apr 24 21:16:17.519838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.519813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.522929 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.522913 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.523532 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.523516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:16:17.523615 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.523572 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:16:17.523663 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.523613 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:16:17.523778 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.523759 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pmvlf\"" Apr 24 
21:16:17.524033 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.524016 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xtf2m" Apr 24 21:16:17.524886 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.524868 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k64g9\"" Apr 24 21:16:17.524957 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.524914 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:17.524992 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.524975 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:17.525945 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.525928 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:16:17.526020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.525942 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:16:17.526118 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526101 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:16:17.526220 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btscm\"" Apr 24 21:16:17.526413 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysctl-conf\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.526459 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-os-release\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.526494 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-cni-multus\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.526494 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-multus-certs\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.526550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovnkube-script-lib\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.526550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526530 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.526604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-cni-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.526604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-daemon-config\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.526676 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovnkube-config\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.526676 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-var-lib-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.526676 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tcgt\" (UniqueName: \"kubernetes.io/projected/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-kube-api-access-7tcgt\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.526676 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526666 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qw2\" (UniqueName: \"kubernetes.io/projected/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-kube-api-access-f5qw2\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.526822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/034136db-21a3-428a-894b-90395491da10-host-slash\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9" Apr 24 21:16:17.526822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-slash\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.526822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-node-log\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.526822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526803 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-log-socket\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.526822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-cnibin\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-kubernetes\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-host\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526890 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-os-release\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-modprobe-d\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.526990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.526986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-lib-modules\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vxrv\" (UniqueName: \"kubernetes.io/projected/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-kube-api-access-8vxrv\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-sys-fs\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-env-overrides\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-systemd\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-ovn\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527168 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xw77\" (UniqueName: \"kubernetes.io/projected/9ec1111b-6a43-49dd-978e-ad82b438f091-kube-api-access-5xw77\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-cni-bin\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-conf-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-systemd-units\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527250 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-system-cni-dir\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc4s8\" (UniqueName: \"kubernetes.io/projected/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-kube-api-access-hc4s8\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-sys\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-socket-dir-parent\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-k8s-cni-cncf-io\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " 
pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-kubelet\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-etc-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-tmp\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-cnibin\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovn-node-metrics-cert\") pod \"ovnkube-node-49kt7\" (UID: 
\"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/034136db-21a3-428a-894b-90395491da10-iptables-alerter-script\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wpj\" (UniqueName: \"kubernetes.io/projected/034136db-21a3-428a-894b-90395491da10-kube-api-access-m7wpj\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527472 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-run-ovn-kubernetes\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527505 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-run-netns\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-cni-bin\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.527558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77098f84-e6ad-4ad0-b456-f4f82edf95bf-konnectivity-ca\") pod \"konnectivity-agent-kt72p\" (UID: \"77098f84-e6ad-4ad0-b456-f4f82edf95bf\") " pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527561 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-cni-binary-copy\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-tmp-dir\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x" 
Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txlk8\" (UniqueName: \"kubernetes.io/projected/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-kube-api-access-txlk8\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-systemd\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-socket-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-tuned\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-netns\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-etc-kubernetes\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77098f84-e6ad-4ad0-b456-f4f82edf95bf-agent-certs\") pod \"konnectivity-agent-kt72p\" (UID: \"77098f84-e6ad-4ad0-b456-f4f82edf95bf\") " pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527792 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-hosts-file\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlmmt\" (UniqueName: \"kubernetes.io/projected/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-kube-api-access-rlmmt\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527839 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-etc-selinux\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527881 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysconfig\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.528147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527915 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-run\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-var-lib-kubelet\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.527970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysctl-d\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-system-cni-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " 
pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528031 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-cni-netd\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-tuning-conf-dir\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-registration-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-device-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-cni-binary-copy\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-hostroot\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.528596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.528128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-kubelet\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.535149 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.535129 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal"] Apr 24 21:16:17.536232 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.536218 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:16:17.536294 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.536285 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal" Apr 24 21:16:17.544431 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.544414 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal"] Apr 24 21:16:17.544507 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:17.544447 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:16:17.553669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.553655 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mqzqz" Apr 24 21:16:17.559402 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.559386 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mqzqz" Apr 24 21:16:17.582698 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.582667 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f85132ac7426286517df0d79e1a6c22.slice/crio-18e5ab9c1f725b5852b9acd8e15aac44eea4868269b62e8aa0786a2063d7c8c4 WatchSource:0}: Error finding container 18e5ab9c1f725b5852b9acd8e15aac44eea4868269b62e8aa0786a2063d7c8c4: Status 404 returned error can't find the container with id 18e5ab9c1f725b5852b9acd8e15aac44eea4868269b62e8aa0786a2063d7c8c4 Apr 24 21:16:17.582979 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.582963 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c544ccdf1879496756152eb8f8b28eb.slice/crio-ac4c9c55b43d539420449c958138e292d67854211a43f23db023a50c7a3a711b WatchSource:0}: Error finding container ac4c9c55b43d539420449c958138e292d67854211a43f23db023a50c7a3a711b: Status 404 returned error can't find the container with id ac4c9c55b43d539420449c958138e292d67854211a43f23db023a50c7a3a711b Apr 24 21:16:17.587686 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.587672 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:16:17.622409 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.622383 2578 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:16:17.628403 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-tuned\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.628477 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-netns\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.628477 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-etc-kubernetes\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.628477 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-netns\") pod \"multus-vvqbk\" 
(UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77098f84-e6ad-4ad0-b456-f4f82edf95bf-agent-certs\") pod \"konnectivity-agent-kt72p\" (UID: \"77098f84-e6ad-4ad0-b456-f4f82edf95bf\") " pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-etc-kubernetes\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-hosts-file\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlmmt\" (UniqueName: \"kubernetes.io/projected/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-kube-api-access-rlmmt\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-hosts-file\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-etc-selinux\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.628619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysconfig\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-run\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-var-lib-kubelet\") pod 
\"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-etc-selinux\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628714 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-run\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628731 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysconfig\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-var-lib-kubelet\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysctl-d\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-system-cni-dir\") pod \"multus-vvqbk\" (UID: 
\"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh25s\" (UniqueName: \"kubernetes.io/projected/799c7f5a-9111-4e65-8973-f1d3fd28c13e-kube-api-access-gh25s\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-cni-netd\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-system-cni-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysctl-d\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-tuning-conf-dir\") pod \"multus-additional-cni-plugins-84bkj\" 
(UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-cni-netd\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-registration-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.629009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.628983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-device-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-cni-binary-copy\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-registration-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-tuning-conf-dir\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-device-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-hostroot\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-kubelet\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysctl-conf\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-os-release\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-hostroot\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-kubelet\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629244 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-cni-multus\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-cni-multus\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-os-release\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-multus-certs\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-sysctl-conf\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.629773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovnkube-script-lib\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-multus-certs\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-cni-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-daemon-config\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovnkube-config\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-cni-binary-copy\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-var-lib-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-cni-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tcgt\" (UniqueName: \"kubernetes.io/projected/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-kube-api-access-7tcgt\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f5qw2\" (UniqueName: \"kubernetes.io/projected/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-kube-api-access-f5qw2\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-var-lib-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/034136db-21a3-428a-894b-90395491da10-host-slash\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-slash\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-node-log\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-log-socket\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.630514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovnkube-script-lib\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-cnibin\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-slash\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629929 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-kubernetes\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-node-log\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.629973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-log-socket\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-kubernetes\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/034136db-21a3-428a-894b-90395491da10-host-slash\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630038 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-daemon-config\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-cnibin\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-host\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-host\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630158 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-os-release\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-modprobe-d\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-os-release\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.631268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-lib-modules\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:17.630298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vxrv\" (UniqueName: \"kubernetes.io/projected/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-kube-api-access-8vxrv\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovnkube-config\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-lib-modules\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.630372 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.630455 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:16:18.130426674 +0000 UTC m=+2.098228732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-modprobe-d\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-sys-fs\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630513 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799c7f5a-9111-4e65-8973-f1d3fd28c13e-serviceca\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-sys-fs\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630544 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-env-overrides\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-systemd\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-ovn\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-systemd\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xw77\" (UniqueName: \"kubernetes.io/projected/9ec1111b-6a43-49dd-978e-ad82b438f091-kube-api-access-5xw77\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:17.630661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-cni-bin\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-ovn\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.632856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-conf-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-systemd-units\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-system-cni-dir\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:17.630786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc4s8\" (UniqueName: \"kubernetes.io/projected/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-kube-api-access-hc4s8\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-cni-bin\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-sys\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-systemd-units\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630846 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-sys\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630785 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-conf-dir\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-socket-dir-parent\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-k8s-cni-cncf-io\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-multus-socket-dir-parent\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630904 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-env-overrides\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-run-k8s-cni-cncf-io\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ec1111b-6a43-49dd-978e-ad82b438f091-system-cni-dir\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-kubelet\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-etc-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.630998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-tmp\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" Apr 24 21:16:17.633496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631027 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-cnibin\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-etc-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-host-var-lib-kubelet\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk" Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovn-node-metrics-cert\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/034136db-21a3-428a-894b-90395491da10-iptables-alerter-script\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9" Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-cnibin\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wpj\" (UniqueName: \"kubernetes.io/projected/034136db-21a3-428a-894b-90395491da10-kube-api-access-m7wpj\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-run-ovn-kubernetes\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799c7f5a-9111-4e65-8973-f1d3fd28c13e-host\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-run-netns\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-cni-bin\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77098f84-e6ad-4ad0-b456-f4f82edf95bf-konnectivity-ca\") pod \"konnectivity-agent-kt72p\" (UID: \"77098f84-e6ad-4ad0-b456-f4f82edf95bf\") " pod="kube-system/konnectivity-agent-kt72p"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-cni-binary-copy\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-run-netns\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-run-ovn-kubernetes\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-run-openvswitch\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.633964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-tmp-dir\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/034136db-21a3-428a-894b-90395491da10-iptables-alerter-script\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txlk8\" (UniqueName: \"kubernetes.io/projected/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-kube-api-access-txlk8\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77098f84-e6ad-4ad0-b456-f4f82edf95bf-konnectivity-ca\") pod \"konnectivity-agent-kt72p\" (UID: \"77098f84-e6ad-4ad0-b456-f4f82edf95bf\") " pod="kube-system/konnectivity-agent-kt72p"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-systemd\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.631955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-host-cni-bin\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-systemd\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-socket-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-etc-tuned\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632166 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-tmp-dir\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632185 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-socket-dir\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77098f84-e6ad-4ad0-b456-f4f82edf95bf-agent-certs\") pod \"konnectivity-agent-kt72p\" (UID: \"77098f84-e6ad-4ad0-b456-f4f82edf95bf\") " pod="kube-system/konnectivity-agent-kt72p"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.632523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ec1111b-6a43-49dd-978e-ad82b438f091-cni-binary-copy\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.633317 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-tmp\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j"
Apr 24 21:16:17.634383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.633414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-ovn-node-metrics-cert\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.636944 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.636926 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:17.636944 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.636945 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:17.637054 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.636953 2578 projected.go:194] Error preparing data for projected volume kube-api-access-z2lsb for pod openshift-network-diagnostics/network-check-target-vrf9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:17.637054 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:17.637010 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb podName:6fda3b6d-a4e2-4aa3-b140-9768563e5f02 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:18.136994082 +0000 UTC m=+2.104796146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z2lsb" (UniqueName: "kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb") pod "network-check-target-vrf9q" (UID: "6fda3b6d-a4e2-4aa3-b140-9768563e5f02") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:17.639147 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.639120 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qw2\" (UniqueName: \"kubernetes.io/projected/c6a36d68-e5f6-4ff5-8bbd-95e656f22006-kube-api-access-f5qw2\") pod \"multus-vvqbk\" (UID: \"c6a36d68-e5f6-4ff5-8bbd-95e656f22006\") " pod="openshift-multus/multus-vvqbk"
Apr 24 21:16:17.639466 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.639447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tcgt\" (UniqueName: \"kubernetes.io/projected/44e23ee3-f057-4ec2-bc73-ccfb6c251e9c-kube-api-access-7tcgt\") pod \"tuned-5dm7j\" (UID: \"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-5dm7j"
Apr 24 21:16:17.639917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.639891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wpj\" (UniqueName: \"kubernetes.io/projected/034136db-21a3-428a-894b-90395491da10-kube-api-access-m7wpj\") pod \"iptables-alerter-8qkh9\" (UID: \"034136db-21a3-428a-894b-90395491da10\") " pod="openshift-network-operator/iptables-alerter-8qkh9"
Apr 24 21:16:17.640045 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.640020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlmmt\" (UniqueName: \"kubernetes.io/projected/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-kube-api-access-rlmmt\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb"
Apr 24 21:16:17.640171 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.640149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vxrv\" (UniqueName: \"kubernetes.io/projected/e70e5f9c-8c1a-4ad0-b8e0-9f7176780519-kube-api-access-8vxrv\") pod \"ovnkube-node-49kt7\" (UID: \"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519\") " pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.640238 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.640187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xw77\" (UniqueName: \"kubernetes.io/projected/9ec1111b-6a43-49dd-978e-ad82b438f091-kube-api-access-5xw77\") pod \"multus-additional-cni-plugins-84bkj\" (UID: \"9ec1111b-6a43-49dd-978e-ad82b438f091\") " pod="openshift-multus/multus-additional-cni-plugins-84bkj"
Apr 24 21:16:17.640416 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.640399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txlk8\" (UniqueName: \"kubernetes.io/projected/ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd-kube-api-access-txlk8\") pod \"aws-ebs-csi-driver-node-2pqq7\" (UID: \"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.640934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.640920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc4s8\" (UniqueName: \"kubernetes.io/projected/2a3c74ea-5d5a-4252-973d-273be9ad3ca5-kube-api-access-hc4s8\") pod \"node-resolver-pvc6x\" (UID: \"2a3c74ea-5d5a-4252-973d-273be9ad3ca5\") " pod="openshift-dns/node-resolver-pvc6x"
Apr 24 21:16:17.658119 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.658087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal" event={"ID":"6c544ccdf1879496756152eb8f8b28eb","Type":"ContainerStarted","Data":"ac4c9c55b43d539420449c958138e292d67854211a43f23db023a50c7a3a711b"}
Apr 24 21:16:17.658939 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.658921 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal" event={"ID":"8f85132ac7426286517df0d79e1a6c22","Type":"ContainerStarted","Data":"18e5ab9c1f725b5852b9acd8e15aac44eea4868269b62e8aa0786a2063d7c8c4"}
Apr 24 21:16:17.732312 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.732258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799c7f5a-9111-4e65-8973-f1d3fd28c13e-host\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.732312 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.732296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh25s\" (UniqueName: \"kubernetes.io/projected/799c7f5a-9111-4e65-8973-f1d3fd28c13e-kube-api-access-gh25s\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.732449 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.732357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799c7f5a-9111-4e65-8973-f1d3fd28c13e-host\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.732449 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.732403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799c7f5a-9111-4e65-8973-f1d3fd28c13e-serviceca\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.732717 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.732702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799c7f5a-9111-4e65-8973-f1d3fd28c13e-serviceca\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.739313 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.739294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh25s\" (UniqueName: \"kubernetes.io/projected/799c7f5a-9111-4e65-8973-f1d3fd28c13e-kube-api-access-gh25s\") pod \"node-ca-xtf2m\" (UID: \"799c7f5a-9111-4e65-8973-f1d3fd28c13e\") " pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.836235 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.836220 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vvqbk"
Apr 24 21:16:17.841853 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.841833 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a36d68_e5f6_4ff5_8bbd_95e656f22006.slice/crio-f6d1c59577a9b9cf97dcdce9f27c1feea4e4d3fdd8bc47e7ae381278f50f5a79 WatchSource:0}: Error finding container f6d1c59577a9b9cf97dcdce9f27c1feea4e4d3fdd8bc47e7ae381278f50f5a79: Status 404 returned error can't find the container with id f6d1c59577a9b9cf97dcdce9f27c1feea4e4d3fdd8bc47e7ae381278f50f5a79
Apr 24 21:16:17.852889 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.852868 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8qkh9"
Apr 24 21:16:17.858907 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.858889 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod034136db_21a3_428a_894b_90395491da10.slice/crio-e1c6afd8dbce85eb564dd79f04f21d78773b12f83b2d01b86803a79960a78c03 WatchSource:0}: Error finding container e1c6afd8dbce85eb564dd79f04f21d78773b12f83b2d01b86803a79960a78c03: Status 404 returned error can't find the container with id e1c6afd8dbce85eb564dd79f04f21d78773b12f83b2d01b86803a79960a78c03
Apr 24 21:16:17.871827 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.871809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:16:17.876892 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.876875 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode70e5f9c_8c1a_4ad0_b8e0_9f7176780519.slice/crio-c00e0437f884ec4cc6959fc436583039547e41de8e0522f3699b66118db7ae97 WatchSource:0}: Error finding container c00e0437f884ec4cc6959fc436583039547e41de8e0522f3699b66118db7ae97: Status 404 returned error can't find the container with id c00e0437f884ec4cc6959fc436583039547e41de8e0522f3699b66118db7ae97
Apr 24 21:16:17.889515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.889500 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kt72p"
Apr 24 21:16:17.895170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.895155 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pvc6x"
Apr 24 21:16:17.895338 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.895312 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77098f84_e6ad_4ad0_b456_f4f82edf95bf.slice/crio-de63de3226e933e4dbdba35ecf62441999f51cd4e7e47251356bcd0a54601ed7 WatchSource:0}: Error finding container de63de3226e933e4dbdba35ecf62441999f51cd4e7e47251356bcd0a54601ed7: Status 404 returned error can't find the container with id de63de3226e933e4dbdba35ecf62441999f51cd4e7e47251356bcd0a54601ed7
Apr 24 21:16:17.900128 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.900112 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-84bkj"
Apr 24 21:16:17.900832 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.900815 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3c74ea_5d5a_4252_973d_273be9ad3ca5.slice/crio-a64190dc0203d3fe53386a351b8d6c7522d04909e9c7fad52f5016d7fb42e6ac WatchSource:0}: Error finding container a64190dc0203d3fe53386a351b8d6c7522d04909e9c7fad52f5016d7fb42e6ac: Status 404 returned error can't find the container with id a64190dc0203d3fe53386a351b8d6c7522d04909e9c7fad52f5016d7fb42e6ac
Apr 24 21:16:17.905860 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.905843 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7"
Apr 24 21:16:17.906209 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.906178 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec1111b_6a43_49dd_978e_ad82b438f091.slice/crio-eecd58a945e5c0825f4a3b0279f99679cf424a57b723804cd1a15ff86be82290 WatchSource:0}: Error finding container eecd58a945e5c0825f4a3b0279f99679cf424a57b723804cd1a15ff86be82290: Status 404 returned error can't find the container with id eecd58a945e5c0825f4a3b0279f99679cf424a57b723804cd1a15ff86be82290
Apr 24 21:16:17.911540 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.911517 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea7ad1d5_1d87_404a_9db5_1c6f3d12f7bd.slice/crio-a102ff66b3ee6bd605ed48318fff76b40d732ee0a962bdbc0dc10a040b9ab5f0 WatchSource:0}: Error finding container a102ff66b3ee6bd605ed48318fff76b40d732ee0a962bdbc0dc10a040b9ab5f0: Status 404 returned error can't find the container with id a102ff66b3ee6bd605ed48318fff76b40d732ee0a962bdbc0dc10a040b9ab5f0
Apr 24 21:16:17.912197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.912182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5dm7j"
Apr 24 21:16:17.916466 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:17.916419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xtf2m"
Apr 24 21:16:17.918437 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.918414 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e23ee3_f057_4ec2_bc73_ccfb6c251e9c.slice/crio-8f8deeaf15a78a24f6a27b316b2a15df30bb6392be016c4095c892e9f084e678 WatchSource:0}: Error finding container 8f8deeaf15a78a24f6a27b316b2a15df30bb6392be016c4095c892e9f084e678: Status 404 returned error can't find the container with id 8f8deeaf15a78a24f6a27b316b2a15df30bb6392be016c4095c892e9f084e678
Apr 24 21:16:17.924263 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:17.924242 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799c7f5a_9111_4e65_8973_f1d3fd28c13e.slice/crio-3093619c37d52669af6c062d5a49b0f020259fd74490107d7602b22d2b2572a2 WatchSource:0}: Error finding container 3093619c37d52669af6c062d5a49b0f020259fd74490107d7602b22d2b2572a2: Status 404 returned error can't find the container with id 3093619c37d52669af6c062d5a49b0f020259fd74490107d7602b22d2b2572a2
Apr 24 21:16:18.135316 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.135233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb"
Apr 24 21:16:18.135449 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.135371 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:18.135449 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.135447 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:16:19.135427084 +0000 UTC m=+3.103229129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:18.236438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.236390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:18.236624 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.236608 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:18.236688 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.236631 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:18.236688 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.236643 2578 projected.go:194] Error preparing data for projected volume kube-api-access-z2lsb for pod openshift-network-diagnostics/network-check-target-vrf9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:18.236824 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.236699 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb podName:6fda3b6d-a4e2-4aa3-b140-9768563e5f02 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:19.236680594 +0000 UTC m=+3.204482649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z2lsb" (UniqueName: "kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb") pod "network-check-target-vrf9q" (UID: "6fda3b6d-a4e2-4aa3-b140-9768563e5f02") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:18.266017 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.265336 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:18.560107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.560026 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:17 +0000 UTC" deadline="2027-12-30 08:33:12.497375184 +0000 UTC"
Apr 24 21:16:18.560107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.560063 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14747h16m53.93731632s"
Apr 24 21:16:18.658085 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.655812 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:18.658085 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:18.655939 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02"
Apr 24 21:16:18.675637 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.675606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" event={"ID":"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c","Type":"ContainerStarted","Data":"8f8deeaf15a78a24f6a27b316b2a15df30bb6392be016c4095c892e9f084e678"}
Apr 24 21:16:18.694900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.694870 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pvc6x" event={"ID":"2a3c74ea-5d5a-4252-973d-273be9ad3ca5","Type":"ContainerStarted","Data":"a64190dc0203d3fe53386a351b8d6c7522d04909e9c7fad52f5016d7fb42e6ac"}
Apr 24 21:16:18.705020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.704980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kt72p" event={"ID":"77098f84-e6ad-4ad0-b456-f4f82edf95bf","Type":"ContainerStarted","Data":"de63de3226e933e4dbdba35ecf62441999f51cd4e7e47251356bcd0a54601ed7"}
Apr 24 21:16:18.713138 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.713113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8qkh9" event={"ID":"034136db-21a3-428a-894b-90395491da10","Type":"ContainerStarted","Data":"e1c6afd8dbce85eb564dd79f04f21d78773b12f83b2d01b86803a79960a78c03"}
Apr 24 21:16:18.719737 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.719711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xtf2m" event={"ID":"799c7f5a-9111-4e65-8973-f1d3fd28c13e","Type":"ContainerStarted","Data":"3093619c37d52669af6c062d5a49b0f020259fd74490107d7602b22d2b2572a2"}
Apr 24 21:16:18.726326 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.726301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" event={"ID":"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd","Type":"ContainerStarted","Data":"a102ff66b3ee6bd605ed48318fff76b40d732ee0a962bdbc0dc10a040b9ab5f0"}
Apr 24 21:16:18.753525 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.753499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerStarted","Data":"eecd58a945e5c0825f4a3b0279f99679cf424a57b723804cd1a15ff86be82290"}
Apr 24 21:16:18.757926 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.757824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"c00e0437f884ec4cc6959fc436583039547e41de8e0522f3699b66118db7ae97"}
Apr 24 21:16:18.769222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.769200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vvqbk" event={"ID":"c6a36d68-e5f6-4ff5-8bbd-95e656f22006","Type":"ContainerStarted","Data":"f6d1c59577a9b9cf97dcdce9f27c1feea4e4d3fdd8bc47e7ae381278f50f5a79"}
Apr 24 21:16:18.868543 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:18.868441 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:19.144276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:19.144198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb"
Apr 24 21:16:19.144425 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.144362 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:19.144425 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.144420 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:16:21.144400734 +0000 UTC m=+5.112202785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:19.244512 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:19.244479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:19.244675 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.244660 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:19.244736 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.244679 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:19.244736 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.244691 2578 projected.go:194] Error preparing data for projected volume kube-api-access-z2lsb for pod openshift-network-diagnostics/network-check-target-vrf9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:19.244864 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.244740 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb podName:6fda3b6d-a4e2-4aa3-b140-9768563e5f02 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:21.244722734 +0000 UTC m=+5.212524781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z2lsb" (UniqueName: "kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb") pod "network-check-target-vrf9q" (UID: "6fda3b6d-a4e2-4aa3-b140-9768563e5f02") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:19.560876 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:19.560703 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:17 +0000 UTC" deadline="2028-01-15 18:37:08.504834546 +0000 UTC"
Apr 24 21:16:19.560876 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:19.560738 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15141h20m48.94409985s"
Apr 24 21:16:19.656111 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:19.656076 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:19.656283 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:19.656207 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:19.796652 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:19.796617 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:20.657994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:20.657965 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:20.658423 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:20.658070 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:21.158311 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:21.158282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:21.158480 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.158457 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:21.158627 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.158533 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:16:25.158512924 +0000 UTC m=+9.126314973 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:21.260603 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:21.259118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:21.260603 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.259288 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:21.260603 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.259309 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:21.260603 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.259322 2578 projected.go:194] Error preparing data for projected volume kube-api-access-z2lsb for pod openshift-network-diagnostics/network-check-target-vrf9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:21.260603 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.259379 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb podName:6fda3b6d-a4e2-4aa3-b140-9768563e5f02 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:25.259358286 +0000 UTC m=+9.227160330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z2lsb" (UniqueName: "kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb") pod "network-check-target-vrf9q" (UID: "6fda3b6d-a4e2-4aa3-b140-9768563e5f02") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:21.655548 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:21.655478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:21.655715 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:21.655674 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:22.655938 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:22.655325 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:22.655938 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:22.655446 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:23.655493 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:23.655454 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:23.655698 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:23.655672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:24.656281 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:24.655882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:24.656281 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:24.655987 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:25.192442 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:25.192409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:25.192622 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.192587 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:25.192685 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.192658 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:16:33.192635996 +0000 UTC m=+17.160438049 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:25.293307 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:25.293260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:25.293461 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.293435 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:25.293461 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.293455 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:25.293543 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.293468 2578 projected.go:194] Error preparing data for projected volume kube-api-access-z2lsb for pod openshift-network-diagnostics/network-check-target-vrf9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:25.293543 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.293525 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb podName:6fda3b6d-a4e2-4aa3-b140-9768563e5f02 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:33.293506754 +0000 UTC m=+17.261308802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z2lsb" (UniqueName: "kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb") pod "network-check-target-vrf9q" (UID: "6fda3b6d-a4e2-4aa3-b140-9768563e5f02") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:25.656354 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:25.655878 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:25.656354 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:25.656003 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:26.656712 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:26.656681 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:26.657160 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:26.656815 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:27.655684 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:27.655657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:27.655849 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:27.655803 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:28.655801 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:28.655771 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:28.656195 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:28.655893 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:29.655911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:29.655881 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:29.656317 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:29.655998 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:30.656062 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:30.656030 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:30.656488 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:30.656158 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:31.655794 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:31.655544 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:31.655943 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:31.655855 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:32.655396 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:32.655370 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:32.655728 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:32.655462 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:33.254560 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:33.254521 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:33.254821 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.254662 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:33.254821 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.254721 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.254705377 +0000 UTC m=+33.222507422 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:33.355806 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:33.355774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:33.355950 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.355912 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:33.355950 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.355934 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:33.355950 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.355949 2578 projected.go:194] Error preparing data for projected volume kube-api-access-z2lsb for pod openshift-network-diagnostics/network-check-target-vrf9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:33.356081 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.356000 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb podName:6fda3b6d-a4e2-4aa3-b140-9768563e5f02 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:49.355987108 +0000 UTC m=+33.323789148 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-z2lsb" (UniqueName: "kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb") pod "network-check-target-vrf9q" (UID: "6fda3b6d-a4e2-4aa3-b140-9768563e5f02") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:33.655648 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:33.655571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:33.656047 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:33.655709 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:34.655924 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:34.655894 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:34.656315 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:34.656009 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:35.655706 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.655670 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:35.655882 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:35.655836 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:35.807005 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.806969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal" event={"ID":"6c544ccdf1879496756152eb8f8b28eb","Type":"ContainerStarted","Data":"88d867138c7811fca76fd88c4d21cc613f351245966a10fd91207bef70ecfb3b"} Apr 24 21:16:35.809791 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.809761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" event={"ID":"44e23ee3-f057-4ec2-bc73-ccfb6c251e9c","Type":"ContainerStarted","Data":"af11f939fc04748b3d315a0d813198cf1c96be4e6398595feac08d998367bf8c"} Apr 24 21:16:35.812870 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.812820 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:16:35.813440 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813414 2578 generic.go:358] "Generic (PLEG): container finished" podID="e70e5f9c-8c1a-4ad0-b8e0-9f7176780519" 
containerID="8443d8ed05742dee4f3b8b8ae6006a084bfb111eb66057e79d9dc82b2b205225" exitCode=1 Apr 24 21:16:35.813535 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"c60ce69c9ea10109c6e73e845d35244c9eced9ce752aa85447db32993db6724c"} Apr 24 21:16:35.813535 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"28a25dcd7714981de039162939b44460b4ede921059ec5c58ca6a3b1d109de0c"} Apr 24 21:16:35.813535 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813477 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"7423bb2bff0ddbf68c917ee27ce890a88abbfb6b732e5b437e94496186556669"} Apr 24 21:16:35.813535 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"f1b0b517acd49be18c81faa8e08277a079140d26fbd7466dd4fd5e2e33bb223c"} Apr 24 21:16:35.813535 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerDied","Data":"8443d8ed05742dee4f3b8b8ae6006a084bfb111eb66057e79d9dc82b2b205225"} Apr 24 21:16:35.813535 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.813514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" 
event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"fc5aa61ec212d2606f11b11789257a59b97b83f325f1127d547c93de6bd160a0"} Apr 24 21:16:35.814616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.814599 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vvqbk" event={"ID":"c6a36d68-e5f6-4ff5-8bbd-95e656f22006","Type":"ContainerStarted","Data":"8153f276b4aac34121bd4d8b03cdf00dca32e495c4623b61c4cfdd40311ba92f"} Apr 24 21:16:35.848417 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.848333 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-248.ec2.internal" podStartSLOduration=18.848317107 podStartE2EDuration="18.848317107s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:35.823876564 +0000 UTC m=+19.791678627" watchObservedRunningTime="2026-04-24 21:16:35.848317107 +0000 UTC m=+19.816119170" Apr 24 21:16:35.871135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:35.871084 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vvqbk" podStartSLOduration=2.605375328 podStartE2EDuration="19.871068177s" podCreationTimestamp="2026-04-24 21:16:16 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.843356207 +0000 UTC m=+1.811158248" lastFinishedPulling="2026-04-24 21:16:35.109049051 +0000 UTC m=+19.076851097" observedRunningTime="2026-04-24 21:16:35.851610614 +0000 UTC m=+19.819412678" watchObservedRunningTime="2026-04-24 21:16:35.871068177 +0000 UTC m=+19.838870243" Apr 24 21:16:36.656095 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.655895 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:36.656267 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:36.656103 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:36.778469 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.778445 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:16:36.817710 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.817682 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xtf2m" event={"ID":"799c7f5a-9111-4e65-8973-f1d3fd28c13e","Type":"ContainerStarted","Data":"d406b17317b076865c7b8a4f008aacd88f16e4419f545bac2caa00214165546f"} Apr 24 21:16:36.819303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.819280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" event={"ID":"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd","Type":"ContainerStarted","Data":"c16034ddb6f0ee948745e130b2369eff25ac578cd7d9631183ffc1ccda2720ae"} Apr 24 21:16:36.819386 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.819305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" event={"ID":"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd","Type":"ContainerStarted","Data":"1b6637570d4121d686ffa1abab78442dfebfca3dd9c94a764f90ca97bc93188c"} Apr 24 21:16:36.820577 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.820552 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="9ec1111b-6a43-49dd-978e-ad82b438f091" containerID="4708c5274113e46b11cb11bcf8ec2d3fa92092a1d4b526bcb826fb31ad0465de" exitCode=0 Apr 24 21:16:36.820657 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.820625 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerDied","Data":"4708c5274113e46b11cb11bcf8ec2d3fa92092a1d4b526bcb826fb31ad0465de"} Apr 24 21:16:36.821897 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.821878 2578 generic.go:358] "Generic (PLEG): container finished" podID="8f85132ac7426286517df0d79e1a6c22" containerID="65ce4974b1280951c401a76f7e1d1c07e078357bad8bf4f173ee7c1a917b267f" exitCode=0 Apr 24 21:16:36.821970 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.821945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal" event={"ID":"8f85132ac7426286517df0d79e1a6c22","Type":"ContainerDied","Data":"65ce4974b1280951c401a76f7e1d1c07e078357bad8bf4f173ee7c1a917b267f"} Apr 24 21:16:36.823290 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.823269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pvc6x" event={"ID":"2a3c74ea-5d5a-4252-973d-273be9ad3ca5","Type":"ContainerStarted","Data":"1e855a580e73e9b994d0bc723c60b95280971d3c70f7beb92b7a16c0558b0eb8"} Apr 24 21:16:36.827403 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.827378 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kt72p" event={"ID":"77098f84-e6ad-4ad0-b456-f4f82edf95bf","Type":"ContainerStarted","Data":"10963eb559ca2cee65ae56adfd5829b9ce1420b233c9064c373004d3990945f5"} Apr 24 21:16:36.828665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.828623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8qkh9" 
event={"ID":"034136db-21a3-428a-894b-90395491da10","Type":"ContainerStarted","Data":"48a01ca6f7339eb7a0d3c68d790f087c85753a4e17022e4bb163d2cba2fe3aa5"} Apr 24 21:16:36.831578 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.831545 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xtf2m" podStartSLOduration=2.6890994900000003 podStartE2EDuration="19.831530945s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.928204706 +0000 UTC m=+1.896006754" lastFinishedPulling="2026-04-24 21:16:35.070636164 +0000 UTC m=+19.038438209" observedRunningTime="2026-04-24 21:16:36.831503343 +0000 UTC m=+20.799305404" watchObservedRunningTime="2026-04-24 21:16:36.831530945 +0000 UTC m=+20.799332989" Apr 24 21:16:36.832197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.832098 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5dm7j" podStartSLOduration=2.6426571020000003 podStartE2EDuration="19.832088471s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.920229809 +0000 UTC m=+1.888031849" lastFinishedPulling="2026-04-24 21:16:35.109661169 +0000 UTC m=+19.077463218" observedRunningTime="2026-04-24 21:16:35.87083685 +0000 UTC m=+19.838638912" watchObservedRunningTime="2026-04-24 21:16:36.832088471 +0000 UTC m=+20.799890532" Apr 24 21:16:36.859069 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.859029 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8qkh9" podStartSLOduration=3.648999191 podStartE2EDuration="20.859017055s" podCreationTimestamp="2026-04-24 21:16:16 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.860591349 +0000 UTC m=+1.828393390" lastFinishedPulling="2026-04-24 21:16:35.0706092 +0000 UTC m=+19.038411254" observedRunningTime="2026-04-24 21:16:36.858815496 +0000 UTC 
m=+20.826617558" watchObservedRunningTime="2026-04-24 21:16:36.859017055 +0000 UTC m=+20.826819117" Apr 24 21:16:36.896384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.896337 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kt72p" podStartSLOduration=3.721813046 podStartE2EDuration="20.896324839s" podCreationTimestamp="2026-04-24 21:16:16 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.896729176 +0000 UTC m=+1.864531216" lastFinishedPulling="2026-04-24 21:16:35.071240966 +0000 UTC m=+19.039043009" observedRunningTime="2026-04-24 21:16:36.875231608 +0000 UTC m=+20.843033669" watchObservedRunningTime="2026-04-24 21:16:36.896324839 +0000 UTC m=+20.864126902" Apr 24 21:16:36.896599 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:36.896571 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pvc6x" podStartSLOduration=3.7238209700000002 podStartE2EDuration="20.896564585s" podCreationTimestamp="2026-04-24 21:16:16 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.90260216 +0000 UTC m=+1.870404200" lastFinishedPulling="2026-04-24 21:16:35.075345775 +0000 UTC m=+19.043147815" observedRunningTime="2026-04-24 21:16:36.89589883 +0000 UTC m=+20.863700892" watchObservedRunningTime="2026-04-24 21:16:36.896564585 +0000 UTC m=+20.864366648" Apr 24 21:16:37.595852 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.595741 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:36.778462157Z","UUID":"8343866b-2441-40fa-bfec-2e82229d00f7","Handler":null,"Name":"","Endpoint":""} Apr 24 21:16:37.598343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.598314 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 
Apr 24 21:16:37.598343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.598346 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:16:37.656158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.656124 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:37.656303 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:37.656249 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:37.833468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.833437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" event={"ID":"ea7ad1d5-1d87-404a-9db5-1c6f3d12f7bd","Type":"ContainerStarted","Data":"0206a9210dc9c420027c394dc8def776dbb39bd192058e9b14499205e9393dbc"} Apr 24 21:16:37.836438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.836411 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:16:37.836788 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.836737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"9d387e7be0625b9a07339f9681a786a4b22092dc2b85216af131224e53fb6f80"} Apr 24 21:16:37.838567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.838537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal" event={"ID":"8f85132ac7426286517df0d79e1a6c22","Type":"ContainerStarted","Data":"7c0b0b14c49281fa73d2433fc04a8ecde7becd234cdeb226fa697a477d1e4e22"} Apr 24 21:16:37.851905 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:37.851834 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pqq7" podStartSLOduration=1.160751707 podStartE2EDuration="20.851820971s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.912922397 +0000 UTC m=+1.880724449" lastFinishedPulling="2026-04-24 21:16:37.603991658 +0000 UTC m=+21.571793713" observedRunningTime="2026-04-24 21:16:37.851355339 +0000 UTC m=+21.819157401" watchObservedRunningTime="2026-04-24 21:16:37.851820971 +0000 UTC m=+21.819623032" Apr 24 21:16:38.506550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:38.506522 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:38.507118 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:38.507079 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:38.521351 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:38.521310 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-248.ec2.internal" podStartSLOduration=21.521297259 podStartE2EDuration="21.521297259s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:37.868890779 +0000 UTC m=+21.836692842" watchObservedRunningTime="2026-04-24 21:16:38.521297259 +0000 UTC m=+22.489099320" Apr 24 21:16:38.655841 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:38.655812 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:38.655968 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:38.655929 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:39.656179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:39.656148 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:39.656622 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:39.656254 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:39.842452 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:39.842426 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:16:40.046025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.045953 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:40.046609 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.046591 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kt72p" Apr 24 21:16:40.655259 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.655199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:40.655434 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:40.655334 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:40.849143 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.848526 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:16:40.849822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.849219 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"56cecaf19a11ee14690daf62aef94de1d0a57ac26fb43394da5c73ec71f8e36c"} Apr 24 21:16:40.849822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.849462 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:40.849822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.849568 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:40.849822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.849684 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:40.849822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.849723 2578 scope.go:117] "RemoveContainer" containerID="8443d8ed05742dee4f3b8b8ae6006a084bfb111eb66057e79d9dc82b2b205225" Apr 24 21:16:40.868215 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.868022 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:40.868215 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:40.868131 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" Apr 24 21:16:41.656192 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 21:16:41.656167 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:41.656334 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:41.656258 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:41.851715 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:41.851686 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ec1111b-6a43-49dd-978e-ad82b438f091" containerID="23d13a65195ef0296ed1dedfc9cd13b045733a712c4a3a71b865833a86c2e4f5" exitCode=0 Apr 24 21:16:41.852107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:41.851780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerDied","Data":"23d13a65195ef0296ed1dedfc9cd13b045733a712c4a3a71b865833a86c2e4f5"} Apr 24 21:16:41.854608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:41.854592 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:16:41.854915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:41.854890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" event={"ID":"e70e5f9c-8c1a-4ad0-b8e0-9f7176780519","Type":"ContainerStarted","Data":"fad5e07d0e0f6843e54b3a5bbb5367ce0d891ae0347a0d9ee153b52fe156d81b"} Apr 24 21:16:41.903884 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:41.903272 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-49kt7" podStartSLOduration=8.620152984 podStartE2EDuration="25.903253346s" podCreationTimestamp="2026-04-24 21:16:16 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.878185171 +0000 UTC m=+1.845987215" lastFinishedPulling="2026-04-24 21:16:35.161285537 +0000 UTC m=+19.129087577" observedRunningTime="2026-04-24 21:16:41.901686787 +0000 UTC m=+25.869488850" watchObservedRunningTime="2026-04-24 21:16:41.903253346 +0000 UTC m=+25.871055406" Apr 24 21:16:42.655522 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:42.655326 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:42.655672 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:42.655610 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:42.894451 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:42.893917 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vrf9q"] Apr 24 21:16:42.894451 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:42.894043 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:42.894451 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:42.894200 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:42.906426 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:42.906247 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcqjb"] Apr 24 21:16:42.906426 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:42.906351 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:42.906574 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:42.906464 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:43.860374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:43.860343 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ec1111b-6a43-49dd-978e-ad82b438f091" containerID="d6ad677c7f4eeda4e52abae5e4f59a048a930fc02cff4fb26ac94eb08a50843a" exitCode=0 Apr 24 21:16:43.860462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:43.860395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerDied","Data":"d6ad677c7f4eeda4e52abae5e4f59a048a930fc02cff4fb26ac94eb08a50843a"} Apr 24 21:16:44.655420 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:44.655346 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:44.655885 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:44.655467 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:44.655885 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:44.655509 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:44.655885 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:44.655593 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:45.866586 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:45.866255 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ec1111b-6a43-49dd-978e-ad82b438f091" containerID="1e21fc7ef0ead76e33fec116260d8aacb043f73e5f26620f2c5dc7379f9301b7" exitCode=0 Apr 24 21:16:45.866586 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:45.866315 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerDied","Data":"1e21fc7ef0ead76e33fec116260d8aacb043f73e5f26620f2c5dc7379f9301b7"} Apr 24 21:16:46.656624 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:46.656400 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:46.656872 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:46.656501 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:46.656872 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:46.656693 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrf9q" podUID="6fda3b6d-a4e2-4aa3-b140-9768563e5f02" Apr 24 21:16:46.656872 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:46.656846 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcqjb" podUID="371c1fec-a68a-4ff5-b5fc-29a34feb3ffe" Apr 24 21:16:48.335665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.335636 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-248.ec2.internal" event="NodeReady" Apr 24 21:16:48.336119 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.335795 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:16:48.412523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.411798 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6bsmb"] Apr 24 21:16:48.432076 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.432053 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-867cc"] Apr 24 21:16:48.432225 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.432214 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6bsmb" Apr 24 21:16:48.438814 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.438714 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:16:48.438814 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.438774 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:16:48.439030 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.439014 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-txjjd\"" Apr 24 21:16:48.447455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.447435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6bsmb"] Apr 24 21:16:48.447548 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.447461 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-867cc"] Apr 24 21:16:48.447548 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.447542 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:16:48.454327 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.454311 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:16:48.454460 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.454435 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:16:48.454568 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.454546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qvvhg\"" Apr 24 21:16:48.454678 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.454626 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:16:48.567336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.567306 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:16:48.567336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.567341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33cc3935-f9cc-4484-ba91-4c3e16828c08-tmp-dir\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:16:48.567584 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:16:48.567375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4v9\" (UniqueName: \"kubernetes.io/projected/5a97d756-3ff0-4986-bf6f-582a917fdc0a-kube-api-access-5w4v9\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:16:48.567584 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.567460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:16:48.567584 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.567494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjlt\" (UniqueName: \"kubernetes.io/projected/33cc3935-f9cc-4484-ba91-4c3e16828c08-kube-api-access-kzjlt\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:16:48.567584 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.567521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33cc3935-f9cc-4484-ba91-4c3e16828c08-config-volume\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:16:48.655865 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.655833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q" Apr 24 21:16:48.656040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.655839 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:16:48.658511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.658488 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:48.658619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.658541 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2rwk\"" Apr 24 21:16:48.658655 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.658495 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:48.658828 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.658780 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m7v99\"" Apr 24 21:16:48.658961 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.658827 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:48.670105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.668980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:16:48.670105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.669022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33cc3935-f9cc-4484-ba91-4c3e16828c08-tmp-dir\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:16:48.670105 ip-10-0-134-248 kubenswrapper[2578]: E0424 
21:16:48.669164 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:48.670105 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:48.669288 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.169268556 +0000 UTC m=+33.137070611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:16:48.670105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.669702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33cc3935-f9cc-4484-ba91-4c3e16828c08-tmp-dir\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:48.670105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.669061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w4v9\" (UniqueName: \"kubernetes.io/projected/5a97d756-3ff0-4986-bf6f-582a917fdc0a-kube-api-access-5w4v9\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:16:48.670508 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.670136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:48.670508 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.670274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjlt\" (UniqueName: \"kubernetes.io/projected/33cc3935-f9cc-4484-ba91-4c3e16828c08-kube-api-access-kzjlt\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:48.670508 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.670453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33cc3935-f9cc-4484-ba91-4c3e16828c08-config-volume\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:48.671115 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.671093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33cc3935-f9cc-4484-ba91-4c3e16828c08-config-volume\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:48.671223 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:48.671210 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:48.671283 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:48.671273 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.1712468 +0000 UTC m=+33.139048842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:48.680758 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.680720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjlt\" (UniqueName: \"kubernetes.io/projected/33cc3935-f9cc-4484-ba91-4c3e16828c08-kube-api-access-kzjlt\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:48.680866 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:48.680793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4v9\" (UniqueName: \"kubernetes.io/projected/5a97d756-3ff0-4986-bf6f-582a917fdc0a-kube-api-access-5w4v9\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:16:49.173637 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:49.173597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:16:49.173825 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:49.173675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:49.173825 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:49.173777 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:49.173931 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:49.173834 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:49.173931 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:49.173850 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:50.173829505 +0000 UTC m=+34.141631551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:16:49.173931 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:49.173891 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:50.173869778 +0000 UTC m=+34.141671819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:49.274954 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:49.274931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb"
Apr 24 21:16:49.275069 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:49.275011 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:16:49.275069 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:49.275048 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:17:21.275038298 +0000 UTC m=+65.242840342 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : secret "metrics-daemon-secret" not found
Apr 24 21:16:49.376057 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:49.376024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:49.378927 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:49.378900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lsb\" (UniqueName: \"kubernetes.io/projected/6fda3b6d-a4e2-4aa3-b140-9768563e5f02-kube-api-access-z2lsb\") pod \"network-check-target-vrf9q\" (UID: \"6fda3b6d-a4e2-4aa3-b140-9768563e5f02\") " pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:49.566847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:49.566773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:50.181310 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:50.181273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:16:50.181497 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:50.181331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:50.181497 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:50.181428 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:50.181497 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:50.181427 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:50.181497 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:50.181486 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.181469378 +0000 UTC m=+36.149271427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:50.181497 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:50.181501 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:52.181493983 +0000 UTC m=+36.149296023 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:16:51.492711 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:51.492577 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vrf9q"]
Apr 24 21:16:51.592330 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:51.592299 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fda3b6d_a4e2_4aa3_b140_9768563e5f02.slice/crio-4a8f2d6e61c468449e68e6fa13c54117fd5c9cf5306e9d497dc873dc3e56bf2e WatchSource:0}: Error finding container 4a8f2d6e61c468449e68e6fa13c54117fd5c9cf5306e9d497dc873dc3e56bf2e: Status 404 returned error can't find the container with id 4a8f2d6e61c468449e68e6fa13c54117fd5c9cf5306e9d497dc873dc3e56bf2e
Apr 24 21:16:51.880310 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:51.880226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerStarted","Data":"c70559ca9c79f39fdb159116e970fba24a3419ced3f940897d07281686630da2"}
Apr 24 21:16:51.881234 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:51.881211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vrf9q" event={"ID":"6fda3b6d-a4e2-4aa3-b140-9768563e5f02","Type":"ContainerStarted","Data":"4a8f2d6e61c468449e68e6fa13c54117fd5c9cf5306e9d497dc873dc3e56bf2e"}
Apr 24 21:16:52.197557 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:52.197507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:16:52.197727 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:52.197569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:52.197727 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:52.197650 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:52.197727 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:52.197673 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:52.197727 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:52.197717 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.197699315 +0000 UTC m=+40.165501368 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:16:52.197939 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:52.197793 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.197770459 +0000 UTC m=+40.165572503 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:52.886312 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:52.886281 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ec1111b-6a43-49dd-978e-ad82b438f091" containerID="c70559ca9c79f39fdb159116e970fba24a3419ced3f940897d07281686630da2" exitCode=0
Apr 24 21:16:52.886810 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:52.886342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerDied","Data":"c70559ca9c79f39fdb159116e970fba24a3419ced3f940897d07281686630da2"}
Apr 24 21:16:53.892387 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:53.892352 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ec1111b-6a43-49dd-978e-ad82b438f091" containerID="8ebdf5a6f6e6c79d764cd7d98fdb7b9730553eb98ba07166f685ff5c601fbe71" exitCode=0
Apr 24 21:16:53.892881 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:53.892400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerDied","Data":"8ebdf5a6f6e6c79d764cd7d98fdb7b9730553eb98ba07166f685ff5c601fbe71"}
Apr 24 21:16:54.896597 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:54.896523 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84bkj" event={"ID":"9ec1111b-6a43-49dd-978e-ad82b438f091","Type":"ContainerStarted","Data":"c34ecfca6ae2cb927f8c82e526e20f136432b7dcd2f4e29bb2b1ed776df9b85b"}
Apr 24 21:16:54.897653 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:54.897632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vrf9q" event={"ID":"6fda3b6d-a4e2-4aa3-b140-9768563e5f02","Type":"ContainerStarted","Data":"aa834beda1d1f12404a4594d734eeb2501abe6ea3f473bc4065b8b7c15692a44"}
Apr 24 21:16:54.897779 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:54.897768 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:16:54.922701 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:54.922624 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-84bkj" podStartSLOduration=4.185714485 podStartE2EDuration="37.922611233s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="2026-04-24 21:16:17.907647333 +0000 UTC m=+1.875449373" lastFinishedPulling="2026-04-24 21:16:51.644544081 +0000 UTC m=+35.612346121" observedRunningTime="2026-04-24 21:16:54.921953406 +0000 UTC m=+38.889755468" watchObservedRunningTime="2026-04-24 21:16:54.922611233 +0000 UTC m=+38.890413295"
Apr 24 21:16:54.940513 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:54.940476 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vrf9q" podStartSLOduration=35.915144316 podStartE2EDuration="38.940465369s" podCreationTimestamp="2026-04-24 21:16:16 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.622837395 +0000 UTC m=+35.590639451" lastFinishedPulling="2026-04-24 21:16:54.648158456 +0000 UTC m=+38.615960504" observedRunningTime="2026-04-24 21:16:54.938985472 +0000 UTC m=+38.906787534" watchObservedRunningTime="2026-04-24 21:16:54.940465369 +0000 UTC m=+38.908267430"
Apr 24 21:16:56.227830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:56.227790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:16:56.227830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:56.227842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:16:56.228265 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:56.227900 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:56.228265 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:56.227907 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:56.228265 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:56.227954 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:04.227940433 +0000 UTC m=+48.195742474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:16:56.228265 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:16:56.227967 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:04.227961143 +0000 UTC m=+48.195763183 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:58.037814 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.037783 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"]
Apr 24 21:16:58.041931 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.041916 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.044512 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.044493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 21:16:58.045285 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.045270 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:16:58.045336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.045278 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:16:58.045472 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.045453 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:16:58.054063 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.054045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"]
Apr 24 21:16:58.140231 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.140212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a83f934c-58c9-422b-bb59-9f9af6e5d81b-klusterlet-config\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.140325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.140246 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a83f934c-58c9-422b-bb59-9f9af6e5d81b-tmp\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.140325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.140264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbx4v\" (UniqueName: \"kubernetes.io/projected/a83f934c-58c9-422b-bb59-9f9af6e5d81b-kube-api-access-jbx4v\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.241530 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.241509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a83f934c-58c9-422b-bb59-9f9af6e5d81b-klusterlet-config\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.241612 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.241547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a83f934c-58c9-422b-bb59-9f9af6e5d81b-tmp\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.241612 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.241564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbx4v\" (UniqueName: \"kubernetes.io/projected/a83f934c-58c9-422b-bb59-9f9af6e5d81b-kube-api-access-jbx4v\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.241962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.241944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a83f934c-58c9-422b-bb59-9f9af6e5d81b-tmp\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.245244 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.245226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a83f934c-58c9-422b-bb59-9f9af6e5d81b-klusterlet-config\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.257410 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.257385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbx4v\" (UniqueName: \"kubernetes.io/projected/a83f934c-58c9-422b-bb59-9f9af6e5d81b-kube-api-access-jbx4v\") pod \"klusterlet-addon-workmgr-f646fbf98-7bw2c\" (UID: \"a83f934c-58c9-422b-bb59-9f9af6e5d81b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.350726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.350671 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:16:58.460468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.460441 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"]
Apr 24 21:16:58.463254 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:16:58.463223 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda83f934c_58c9_422b_bb59_9f9af6e5d81b.slice/crio-90f7af2085f6e063bfe602ea60e82a7262eefb2aaf501addfcb36038f1d99535 WatchSource:0}: Error finding container 90f7af2085f6e063bfe602ea60e82a7262eefb2aaf501addfcb36038f1d99535: Status 404 returned error can't find the container with id 90f7af2085f6e063bfe602ea60e82a7262eefb2aaf501addfcb36038f1d99535
Apr 24 21:16:58.905380 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:16:58.905347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c" event={"ID":"a83f934c-58c9-422b-bb59-9f9af6e5d81b","Type":"ContainerStarted","Data":"90f7af2085f6e063bfe602ea60e82a7262eefb2aaf501addfcb36038f1d99535"}
Apr 24 21:17:02.914286 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:02.914246 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c" event={"ID":"a83f934c-58c9-422b-bb59-9f9af6e5d81b","Type":"ContainerStarted","Data":"f4201c09516236658cfffb964c2a1138022cd26b88572137f75b5eb4a8157c36"}
Apr 24 21:17:02.914668 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:02.914487 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:17:02.916105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:02.916086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c"
Apr 24 21:17:02.933160 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:02.933120 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f646fbf98-7bw2c" podStartSLOduration=1.1410254260000001 podStartE2EDuration="4.933108001s" podCreationTimestamp="2026-04-24 21:16:58 +0000 UTC" firstStartedPulling="2026-04-24 21:16:58.464940363 +0000 UTC m=+42.432742403" lastFinishedPulling="2026-04-24 21:17:02.25702292 +0000 UTC m=+46.224824978" observedRunningTime="2026-04-24 21:17:02.932951665 +0000 UTC m=+46.900753728" watchObservedRunningTime="2026-04-24 21:17:02.933108001 +0000 UTC m=+46.900910058"
Apr 24 21:17:04.286910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:04.286871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:17:04.287315 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:04.286922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:17:04.287315 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:04.287007 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:17:04.287315 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:04.287034 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:17:04.287315 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:04.287081 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:20.287066901 +0000 UTC m=+64.254868941 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:17:04.287315 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:04.287095 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:20.287089869 +0000 UTC m=+64.254891909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:17:12.903969 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:12.903939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-49kt7"
Apr 24 21:17:20.289266 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:20.289234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc"
Apr 24 21:17:20.289266 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:20.289282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:17:20.289733 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:20.289385 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:17:20.289733 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:20.289468 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert podName:5a97d756-3ff0-4986-bf6f-582a917fdc0a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:52.289453702 +0000 UTC m=+96.257255745 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert") pod "ingress-canary-867cc" (UID: "5a97d756-3ff0-4986-bf6f-582a917fdc0a") : secret "canary-serving-cert" not found
Apr 24 21:17:20.289733 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:20.289385 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:17:20.289733 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:20.289542 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls podName:33cc3935-f9cc-4484-ba91-4c3e16828c08 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:52.289524863 +0000 UTC m=+96.257326908 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls") pod "dns-default-6bsmb" (UID: "33cc3935-f9cc-4484-ba91-4c3e16828c08") : secret "dns-default-metrics-tls" not found
Apr 24 21:17:21.295857 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:21.295822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb"
Apr 24 21:17:21.296219 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:21.295939 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:17:21.296219 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:21.295989 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs podName:371c1fec-a68a-4ff5-b5fc-29a34feb3ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:18:25.295976043 +0000 UTC m=+129.263778083 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs") pod "network-metrics-daemon-bcqjb" (UID: "371c1fec-a68a-4ff5-b5fc-29a34feb3ffe") : secret "metrics-daemon-secret" not found
Apr 24 21:17:25.902165 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:25.902060 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vrf9q"
Apr 24 21:17:29.987723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:29.987692 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz"]
Apr 24 21:17:29.990460 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:29.990445 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz"
Apr 24 21:17:29.992647 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:29.992625 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:17:29.993369 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:29.993349 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:29.993460 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:29.993419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fq8r2\""
Apr 24 21:17:30.003599 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.003576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz"]
Apr 24 21:17:30.091077 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.091053 2578 kubelet.go:2537]
"SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-d76d9f5d-zjqpm"] Apr 24 21:17:30.093797 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.093782 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.096032 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096008 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:17:30.096118 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096067 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:17:30.096212 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096196 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:17:30.096262 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096236 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:17:30.096406 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096391 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:17:30.096564 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096551 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hvsrb\"" Apr 24 21:17:30.096603 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.096566 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:17:30.103330 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.103313 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d76d9f5d-zjqpm"] Apr 24 21:17:30.152395 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:17:30.152376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxqd\" (UniqueName: \"kubernetes.io/projected/549de488-07e9-4da0-aa3b-352b0762cb06-kube-api-access-ztxqd\") pod \"volume-data-source-validator-7c6cbb6c87-pmdfz\" (UID: \"549de488-07e9-4da0-aa3b-352b0762cb06\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" Apr 24 21:17:30.252901 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.252841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.252901 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.252877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.253012 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.252914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-default-certificate\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.253012 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.252954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxqd\" (UniqueName: 
\"kubernetes.io/projected/549de488-07e9-4da0-aa3b-352b0762cb06-kube-api-access-ztxqd\") pod \"volume-data-source-validator-7c6cbb6c87-pmdfz\" (UID: \"549de488-07e9-4da0-aa3b-352b0762cb06\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" Apr 24 21:17:30.253076 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.253014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-stats-auth\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.253107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.253073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78cpj\" (UniqueName: \"kubernetes.io/projected/98762743-4e7e-44a9-878c-f6893dcf9c44-kube-api-access-78cpj\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.262701 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.262683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxqd\" (UniqueName: \"kubernetes.io/projected/549de488-07e9-4da0-aa3b-352b0762cb06-kube-api-access-ztxqd\") pod \"volume-data-source-validator-7c6cbb6c87-pmdfz\" (UID: \"549de488-07e9-4da0-aa3b-352b0762cb06\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" Apr 24 21:17:30.298507 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.298485 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" Apr 24 21:17:30.306909 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.306884 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd"] Apr 24 21:17:30.309953 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.309932 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-bd77785df-clsxr"] Apr 24 21:17:30.310105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.310079 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.313067 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.313048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.314766 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.314732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 21:17:30.314952 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.314937 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 21:17:30.315173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.315159 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bpz6p\"" Apr 24 21:17:30.315230 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.315158 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:17:30.315618 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.315602 2578 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:17:30.315668 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.315639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:17:30.316020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.316000 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 21:17:30.316587 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.316567 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:17:30.316854 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.316823 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bc8qn\"" Apr 24 21:17:30.323381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.323361 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd"] Apr 24 21:17:30.326408 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.326390 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:17:30.329731 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.329711 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bd77785df-clsxr"] Apr 24 21:17:30.354055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.353592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " 
pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.354055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.353634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.354055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.353679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-default-certificate\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.354055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.353708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-stats-auth\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.354055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.353777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78cpj\" (UniqueName: \"kubernetes.io/projected/98762743-4e7e-44a9-878c-f6893dcf9c44-kube-api-access-78cpj\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.354055 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.353883 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:30.354055 ip-10-0-134-248 
kubenswrapper[2578]: E0424 21:17:30.353947 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:30.853928171 +0000 UTC m=+74.821730228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : secret "router-metrics-certs-default" not found Apr 24 21:17:30.354477 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.354304 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:30.854286369 +0000 UTC m=+74.822088428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:30.357056 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.357012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-default-certificate\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.357578 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.357552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-stats-auth\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.372343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.372301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78cpj\" (UniqueName: \"kubernetes.io/projected/98762743-4e7e-44a9-878c-f6893dcf9c44-kube-api-access-78cpj\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.423101 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.420621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz"] Apr 24 21:17:30.423741 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:17:30.423707 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549de488_07e9_4da0_aa3b_352b0762cb06.slice/crio-2239a7e081b3b58efcbad249ba9245e45aa97e856abbf6e3db73b90f5585a795 WatchSource:0}: Error finding container 2239a7e081b3b58efcbad249ba9245e45aa97e856abbf6e3db73b90f5585a795: Status 404 returned error can't find the container with id 2239a7e081b3b58efcbad249ba9245e45aa97e856abbf6e3db73b90f5585a795 Apr 24 21:17:30.455063 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80054b0a-2a30-40a5-87a8-568c2346c169-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.455161 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455078 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2c4w\" (UniqueName: \"kubernetes.io/projected/80054b0a-2a30-40a5-87a8-568c2346c169-kube-api-access-l2c4w\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.455161 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kllxj\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-kube-api-access-kllxj\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-installation-pull-secrets\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f78a7f-b7ec-4945-937f-606241594124-ca-trust-extracted\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455409 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-trusted-ca\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455409 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80054b0a-2a30-40a5-87a8-568c2346c169-config\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.455409 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-bound-sa-token\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455409 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-registry-certificates\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.455543 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.455422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-image-registry-private-configuration\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556284 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-image-registry-private-configuration\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556284 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:17:30.556264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80054b0a-2a30-40a5-87a8-568c2346c169-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.556425 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556305 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2c4w\" (UniqueName: \"kubernetes.io/projected/80054b0a-2a30-40a5-87a8-568c2346c169-kube-api-access-l2c4w\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.556425 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kllxj\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-kube-api-access-kllxj\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556425 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556425 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-installation-pull-secrets\") pod 
\"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f78a7f-b7ec-4945-937f-606241594124-ca-trust-extracted\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-trusted-ca\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80054b0a-2a30-40a5-87a8-568c2346c169-config\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-bound-sa-token\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.556558 2578 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-registry-certificates\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.556613 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.556576 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bd77785df-clsxr: secret "image-registry-tls" not found Apr 24 21:17:30.556942 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.556687 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls podName:d2f78a7f-b7ec-4945-937f-606241594124 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:31.056668005 +0000 UTC m=+75.024470059 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls") pod "image-registry-bd77785df-clsxr" (UID: "d2f78a7f-b7ec-4945-937f-606241594124") : secret "image-registry-tls" not found Apr 24 21:17:30.556942 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.556853 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f78a7f-b7ec-4945-937f-606241594124-ca-trust-extracted\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.557091 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.557069 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80054b0a-2a30-40a5-87a8-568c2346c169-config\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.557520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.557496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-registry-certificates\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.557623 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.557602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-trusted-ca\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.558674 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:17:30.558651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80054b0a-2a30-40a5-87a8-568c2346c169-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.559032 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.559013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-image-registry-private-configuration\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.559209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.559193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-installation-pull-secrets\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.564609 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.564587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-bound-sa-token\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.565294 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.565275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kllxj\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-kube-api-access-kllxj\") pod 
\"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:30.565410 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.565397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2c4w\" (UniqueName: \"kubernetes.io/projected/80054b0a-2a30-40a5-87a8-568c2346c169-kube-api-access-l2c4w\") pod \"service-ca-operator-d6fc45fc5-pmmqd\" (UID: \"80054b0a-2a30-40a5-87a8-568c2346c169\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.620423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.620403 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" Apr 24 21:17:30.729177 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.729145 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd"] Apr 24 21:17:30.733034 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:17:30.733009 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80054b0a_2a30_40a5_87a8_568c2346c169.slice/crio-4f3a14f42bd0af387472fd06a294c43a48fc9f5709ed3633fc082bc93e7eeb29 WatchSource:0}: Error finding container 4f3a14f42bd0af387472fd06a294c43a48fc9f5709ed3633fc082bc93e7eeb29: Status 404 returned error can't find the container with id 4f3a14f42bd0af387472fd06a294c43a48fc9f5709ed3633fc082bc93e7eeb29 Apr 24 21:17:30.859350 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.858780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " 
pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.859350 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.858822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:30.859350 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.858934 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:30.859350 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.858977 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:31.858964339 +0000 UTC m=+75.826766379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : secret "router-metrics-certs-default" not found Apr 24 21:17:30.859350 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:30.859134 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:31.859118495 +0000 UTC m=+75.826920536 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:30.963052 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.963018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" event={"ID":"80054b0a-2a30-40a5-87a8-568c2346c169","Type":"ContainerStarted","Data":"4f3a14f42bd0af387472fd06a294c43a48fc9f5709ed3633fc082bc93e7eeb29"} Apr 24 21:17:30.964082 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:30.964054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" event={"ID":"549de488-07e9-4da0-aa3b-352b0762cb06","Type":"ContainerStarted","Data":"2239a7e081b3b58efcbad249ba9245e45aa97e856abbf6e3db73b90f5585a795"} Apr 24 21:17:31.060205 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:31.060174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:31.060589 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:31.060344 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:31.060589 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:31.060364 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bd77785df-clsxr: secret "image-registry-tls" not found Apr 24 21:17:31.060589 ip-10-0-134-248 kubenswrapper[2578]: E0424 
21:17:31.060428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls podName:d2f78a7f-b7ec-4945-937f-606241594124 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:32.060408051 +0000 UTC m=+76.028210096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls") pod "image-registry-bd77785df-clsxr" (UID: "d2f78a7f-b7ec-4945-937f-606241594124") : secret "image-registry-tls" not found Apr 24 21:17:31.867618 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:31.867588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:31.867618 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:31.867622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:31.867808 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:31.867737 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:33.867718115 +0000 UTC m=+77.835520155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:31.867808 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:31.867786 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:31.867882 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:31.867838 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:33.867823846 +0000 UTC m=+77.835625902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : secret "router-metrics-certs-default" not found Apr 24 21:17:31.967603 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:31.967516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" event={"ID":"549de488-07e9-4da0-aa3b-352b0762cb06","Type":"ContainerStarted","Data":"3eb92ab4968a29b6f9fe0645073a27ceb84eaee944ddcc14653833cc5b9e6fa8"} Apr 24 21:17:31.984879 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:31.984829 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pmdfz" podStartSLOduration=1.7563850730000001 podStartE2EDuration="2.984812097s" podCreationTimestamp="2026-04-24 21:17:29 +0000 UTC" firstStartedPulling="2026-04-24 21:17:30.425067638 +0000 UTC 
m=+74.392869678" lastFinishedPulling="2026-04-24 21:17:31.65349466 +0000 UTC m=+75.621296702" observedRunningTime="2026-04-24 21:17:31.983706268 +0000 UTC m=+75.951508514" watchObservedRunningTime="2026-04-24 21:17:31.984812097 +0000 UTC m=+75.952614159" Apr 24 21:17:32.068625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:32.068594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:32.068999 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:32.068768 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:32.068999 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:32.068789 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bd77785df-clsxr: secret "image-registry-tls" not found Apr 24 21:17:32.068999 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:32.068853 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls podName:d2f78a7f-b7ec-4945-937f-606241594124 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:34.068833473 +0000 UTC m=+78.036635516 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls") pod "image-registry-bd77785df-clsxr" (UID: "d2f78a7f-b7ec-4945-937f-606241594124") : secret "image-registry-tls" not found Apr 24 21:17:32.970253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:32.970218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" event={"ID":"80054b0a-2a30-40a5-87a8-568c2346c169","Type":"ContainerStarted","Data":"30427c3f64fc0b3f3ebdb1f004363c6cb6be5acd2136578763d3212fd9261b5b"} Apr 24 21:17:33.007596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:33.007547 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" podStartSLOduration=0.959816716 podStartE2EDuration="3.007534759s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="2026-04-24 21:17:30.734884044 +0000 UTC m=+74.702686084" lastFinishedPulling="2026-04-24 21:17:32.782602084 +0000 UTC m=+76.750404127" observedRunningTime="2026-04-24 21:17:33.007320185 +0000 UTC m=+76.975122257" watchObservedRunningTime="2026-04-24 21:17:33.007534759 +0000 UTC m=+76.975336820" Apr 24 21:17:33.881925 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:33.881896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:33.881925 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:33.881930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod 
\"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:33.882478 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:33.882023 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:33.882478 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:33.882042 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:37.882025777 +0000 UTC m=+81.849827821 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:33.882478 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:33.882118 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:37.882110359 +0000 UTC m=+81.849912403 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : secret "router-metrics-certs-default" not found Apr 24 21:17:34.083300 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:34.083273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:34.083413 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:34.083380 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:34.083413 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:34.083399 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bd77785df-clsxr: secret "image-registry-tls" not found Apr 24 21:17:34.083483 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:34.083450 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls podName:d2f78a7f-b7ec-4945-937f-606241594124 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:38.083434022 +0000 UTC m=+82.051236063 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls") pod "image-registry-bd77785df-clsxr" (UID: "d2f78a7f-b7ec-4945-937f-606241594124") : secret "image-registry-tls" not found Apr 24 21:17:37.679887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:37.679858 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pvc6x_2a3c74ea-5d5a-4252-973d-273be9ad3ca5/dns-node-resolver/0.log" Apr 24 21:17:37.908917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:37.908886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:37.908917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:37.908916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:37.909088 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:37.909038 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:45.90902144 +0000 UTC m=+89.876823484 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:37.909088 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:37.909038 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:37.909088 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:37.909074 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:45.9090679 +0000 UTC m=+89.876869944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : secret "router-metrics-certs-default" not found Apr 24 21:17:38.109632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:38.109561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:38.109781 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:38.109696 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:38.109781 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:38.109711 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-bd77785df-clsxr: secret "image-registry-tls" not found Apr 24 21:17:38.109781 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:38.109775 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls podName:d2f78a7f-b7ec-4945-937f-606241594124 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:46.109761916 +0000 UTC m=+90.077563973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls") pod "image-registry-bd77785df-clsxr" (UID: "d2f78a7f-b7ec-4945-937f-606241594124") : secret "image-registry-tls" not found Apr 24 21:17:39.100097 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:39.100070 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xtf2m_799c7f5a-9111-4e65-8973-f1d3fd28c13e/node-ca/0.log" Apr 24 21:17:45.969242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:45.969208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:45.969242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:45.969247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:45.969719 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:17:45.969407 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle podName:98762743-4e7e-44a9-878c-f6893dcf9c44 nodeName:}" failed. No retries permitted until 2026-04-24 21:18:01.969385382 +0000 UTC m=+105.937187456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle") pod "router-default-d76d9f5d-zjqpm" (UID: "98762743-4e7e-44a9-878c-f6893dcf9c44") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:45.971582 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:45.971564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98762743-4e7e-44a9-878c-f6893dcf9c44-metrics-certs\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm" Apr 24 21:17:46.170196 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.170168 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:46.172435 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.172415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"image-registry-bd77785df-clsxr\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:46.226364 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.226293 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:46.364718 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.364688 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bd77785df-clsxr"] Apr 24 21:17:46.368286 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:17:46.368256 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f78a7f_b7ec_4945_937f_606241594124.slice/crio-680542c3f9b84d77ee04aae5d29040e76196c0ea8873c485d86c64a95da0cb62 WatchSource:0}: Error finding container 680542c3f9b84d77ee04aae5d29040e76196c0ea8873c485d86c64a95da0cb62: Status 404 returned error can't find the container with id 680542c3f9b84d77ee04aae5d29040e76196c0ea8873c485d86c64a95da0cb62 Apr 24 21:17:46.998086 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.998054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bd77785df-clsxr" event={"ID":"d2f78a7f-b7ec-4945-937f-606241594124","Type":"ContainerStarted","Data":"68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da"} Apr 24 21:17:46.998086 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.998089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bd77785df-clsxr" event={"ID":"d2f78a7f-b7ec-4945-937f-606241594124","Type":"ContainerStarted","Data":"680542c3f9b84d77ee04aae5d29040e76196c0ea8873c485d86c64a95da0cb62"} Apr 24 21:17:46.998699 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:46.998224 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:17:47.021934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:47.021886 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-bd77785df-clsxr" 
podStartSLOduration=17.021871926 podStartE2EDuration="17.021871926s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:47.021316364 +0000 UTC m=+90.989118426" watchObservedRunningTime="2026-04-24 21:17:47.021871926 +0000 UTC m=+90.989673989" Apr 24 21:17:52.311822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.311782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:17:52.312183 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.311850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:17:52.314162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.314130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a97d756-3ff0-4986-bf6f-582a917fdc0a-cert\") pod \"ingress-canary-867cc\" (UID: \"5a97d756-3ff0-4986-bf6f-582a917fdc0a\") " pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:17:52.314260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.314140 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33cc3935-f9cc-4484-ba91-4c3e16828c08-metrics-tls\") pod \"dns-default-6bsmb\" (UID: \"33cc3935-f9cc-4484-ba91-4c3e16828c08\") " pod="openshift-dns/dns-default-6bsmb" Apr 24 21:17:52.345876 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.345854 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-txjjd\"" Apr 24 21:17:52.354015 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.353996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6bsmb" Apr 24 21:17:52.359639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.359232 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qvvhg\"" Apr 24 21:17:52.367140 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.367121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-867cc" Apr 24 21:17:52.483455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.483390 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6bsmb"] Apr 24 21:17:52.486158 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:17:52.486125 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33cc3935_f9cc_4484_ba91_4c3e16828c08.slice/crio-380b5c9487a694801bac6f1fb6b94cf2261626c61f59a5dd30b75f600e332fb7 WatchSource:0}: Error finding container 380b5c9487a694801bac6f1fb6b94cf2261626c61f59a5dd30b75f600e332fb7: Status 404 returned error can't find the container with id 380b5c9487a694801bac6f1fb6b94cf2261626c61f59a5dd30b75f600e332fb7 Apr 24 21:17:52.497401 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:52.497377 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-867cc"] Apr 24 21:17:52.500261 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:17:52.500241 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a97d756_3ff0_4986_bf6f_582a917fdc0a.slice/crio-9cf20a975999f7cd7251549d215890cc3378078733f211bab19cd4dd68874d2e 
WatchSource:0}: Error finding container 9cf20a975999f7cd7251549d215890cc3378078733f211bab19cd4dd68874d2e: Status 404 returned error can't find the container with id 9cf20a975999f7cd7251549d215890cc3378078733f211bab19cd4dd68874d2e
Apr 24 21:17:53.014664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:53.014613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-867cc" event={"ID":"5a97d756-3ff0-4986-bf6f-582a917fdc0a","Type":"ContainerStarted","Data":"9cf20a975999f7cd7251549d215890cc3378078733f211bab19cd4dd68874d2e"}
Apr 24 21:17:53.015688 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:53.015648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6bsmb" event={"ID":"33cc3935-f9cc-4484-ba91-4c3e16828c08","Type":"ContainerStarted","Data":"380b5c9487a694801bac6f1fb6b94cf2261626c61f59a5dd30b75f600e332fb7"}
Apr 24 21:17:55.024732 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:55.024696 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-867cc" event={"ID":"5a97d756-3ff0-4986-bf6f-582a917fdc0a","Type":"ContainerStarted","Data":"05a204f7e317dbf34d4ac5711089ecd9f22d4ba2aec4e6535fee6f9ab67a55ed"}
Apr 24 21:17:55.026272 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:55.026248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6bsmb" event={"ID":"33cc3935-f9cc-4484-ba91-4c3e16828c08","Type":"ContainerStarted","Data":"e290f278d4b579c9d72ebdff6d04962cbbe45f2c516eb61e23a87954caca61e2"}
Apr 24 21:17:55.026383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:55.026276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6bsmb" event={"ID":"33cc3935-f9cc-4484-ba91-4c3e16828c08","Type":"ContainerStarted","Data":"59a6b8a49f03cbf425f2633c74ad6fbad15a975c8900e924ba34a7bcc20aa585"}
Apr 24 21:17:55.026383 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:55.026365 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:17:55.045466 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:55.045428 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-867cc" podStartSLOduration=65.348216137 podStartE2EDuration="1m7.045417109s" podCreationTimestamp="2026-04-24 21:16:48 +0000 UTC" firstStartedPulling="2026-04-24 21:17:52.501800976 +0000 UTC m=+96.469603016" lastFinishedPulling="2026-04-24 21:17:54.199001944 +0000 UTC m=+98.166803988" observedRunningTime="2026-04-24 21:17:55.044735896 +0000 UTC m=+99.012537961" watchObservedRunningTime="2026-04-24 21:17:55.045417109 +0000 UTC m=+99.013219171"
Apr 24 21:17:55.064067 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:17:55.063964 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6bsmb" podStartSLOduration=65.356805955 podStartE2EDuration="1m7.06395093s" podCreationTimestamp="2026-04-24 21:16:48 +0000 UTC" firstStartedPulling="2026-04-24 21:17:52.487877311 +0000 UTC m=+96.455679352" lastFinishedPulling="2026-04-24 21:17:54.195022281 +0000 UTC m=+98.162824327" observedRunningTime="2026-04-24 21:17:55.063595142 +0000 UTC m=+99.031397204" watchObservedRunningTime="2026-04-24 21:17:55.06395093 +0000 UTC m=+99.031752992"
Apr 24 21:18:01.974933 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:01.974892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:01.975486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:01.975465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98762743-4e7e-44a9-878c-f6893dcf9c44-service-ca-bundle\") pod \"router-default-d76d9f5d-zjqpm\" (UID: \"98762743-4e7e-44a9-878c-f6893dcf9c44\") " pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:02.201483 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:02.201436 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:02.314890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:02.314859 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d76d9f5d-zjqpm"]
Apr 24 21:18:02.317793 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:02.317767 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98762743_4e7e_44a9_878c_f6893dcf9c44.slice/crio-5039f6b22879473e3e8dcf041174c1377dbc38cb9dd0af7fd804142d939a270a WatchSource:0}: Error finding container 5039f6b22879473e3e8dcf041174c1377dbc38cb9dd0af7fd804142d939a270a: Status 404 returned error can't find the container with id 5039f6b22879473e3e8dcf041174c1377dbc38cb9dd0af7fd804142d939a270a
Apr 24 21:18:03.004051 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.004016 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"]
Apr 24 21:18:03.007015 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.006989 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.011900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.011869 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-znx9t\""
Apr 24 21:18:03.011900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.011889 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 21:18:03.012087 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.011947 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 21:18:03.046268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.046233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d76d9f5d-zjqpm" event={"ID":"98762743-4e7e-44a9-878c-f6893dcf9c44","Type":"ContainerStarted","Data":"4789b4ee15824c9f8c993115b41ff4943941ebb2fc18d11545e347c8fdfce4a2"}
Apr 24 21:18:03.046423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.046275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d76d9f5d-zjqpm" event={"ID":"98762743-4e7e-44a9-878c-f6893dcf9c44","Type":"ContainerStarted","Data":"5039f6b22879473e3e8dcf041174c1377dbc38cb9dd0af7fd804142d939a270a"}
Apr 24 21:18:03.052407 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.052383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"]
Apr 24 21:18:03.077938 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.075584 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bd77785df-clsxr"]
Apr 24 21:18:03.081838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.081807 2578 patch_prober.go:28] interesting pod/image-registry-bd77785df-clsxr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 21:18:03.081966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.081855 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bd77785df-clsxr" podUID="d2f78a7f-b7ec-4945-937f-606241594124" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:18:03.084049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.084027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/22fc467d-f2cb-40ba-8129-3562ce16391d-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5t8j7\" (UID: \"22fc467d-f2cb-40ba-8129-3562ce16391d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.084153 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.084077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22fc467d-f2cb-40ba-8129-3562ce16391d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5t8j7\" (UID: \"22fc467d-f2cb-40ba-8129-3562ce16391d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.114649 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.114608 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d76d9f5d-zjqpm" podStartSLOduration=33.114595061 podStartE2EDuration="33.114595061s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:18:03.112426489 +0000 UTC m=+107.080228550" watchObservedRunningTime="2026-04-24 21:18:03.114595061 +0000 UTC m=+107.082397157"
Apr 24 21:18:03.130542 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.130516 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ccjvm"]
Apr 24 21:18:03.133799 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.133782 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.136528 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.136508 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:18:03.137483 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.137466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:18:03.137812 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.137797 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:18:03.137812 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.137805 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hxfzx\""
Apr 24 21:18:03.137943 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.137816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:18:03.147450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.147432 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ccjvm"]
Apr 24 21:18:03.184858 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.184830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/22fc467d-f2cb-40ba-8129-3562ce16391d-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5t8j7\" (UID: \"22fc467d-f2cb-40ba-8129-3562ce16391d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.184979 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.184872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3a9aad5-ac45-4187-9590-f87a0a2157cd-crio-socket\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.184979 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.184898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3a9aad5-ac45-4187-9590-f87a0a2157cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.185090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.184983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgv5h\" (UniqueName: \"kubernetes.io/projected/a3a9aad5-ac45-4187-9590-f87a0a2157cd-kube-api-access-dgv5h\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.185090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.185017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22fc467d-f2cb-40ba-8129-3562ce16391d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5t8j7\" (UID: \"22fc467d-f2cb-40ba-8129-3562ce16391d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.185090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.185049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3a9aad5-ac45-4187-9590-f87a0a2157cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.185193 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.185098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3a9aad5-ac45-4187-9590-f87a0a2157cd-data-volume\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.185596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.185566 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/22fc467d-f2cb-40ba-8129-3562ce16391d-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5t8j7\" (UID: \"22fc467d-f2cb-40ba-8129-3562ce16391d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.187628 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.187609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22fc467d-f2cb-40ba-8129-3562ce16391d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5t8j7\" (UID: \"22fc467d-f2cb-40ba-8129-3562ce16391d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.202437 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.202420 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:03.204781 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.204742 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:03.285615 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.285540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3a9aad5-ac45-4187-9590-f87a0a2157cd-crio-socket\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.285615 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.285573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3a9aad5-ac45-4187-9590-f87a0a2157cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.285615 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.285610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgv5h\" (UniqueName: \"kubernetes.io/projected/a3a9aad5-ac45-4187-9590-f87a0a2157cd-kube-api-access-dgv5h\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.285910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.285642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3a9aad5-ac45-4187-9590-f87a0a2157cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.285910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.285663 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3a9aad5-ac45-4187-9590-f87a0a2157cd-crio-socket\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.285910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.285667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3a9aad5-ac45-4187-9590-f87a0a2157cd-data-volume\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.286048 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.286024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3a9aad5-ac45-4187-9590-f87a0a2157cd-data-volume\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.286197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.286179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3a9aad5-ac45-4187-9590-f87a0a2157cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.288595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.288574 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3a9aad5-ac45-4187-9590-f87a0a2157cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.304858 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.304832 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgv5h\" (UniqueName: \"kubernetes.io/projected/a3a9aad5-ac45-4187-9590-f87a0a2157cd-kube-api-access-dgv5h\") pod \"insights-runtime-extractor-ccjvm\" (UID: \"a3a9aad5-ac45-4187-9590-f87a0a2157cd\") " pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.316763 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.316732 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"
Apr 24 21:18:03.442504 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.442477 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ccjvm"
Apr 24 21:18:03.460407 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.460380 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7"]
Apr 24 21:18:03.463204 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:03.463178 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22fc467d_f2cb_40ba_8129_3562ce16391d.slice/crio-821b6089d2b60615af52ca8624ff390650fa09a3f5027f733505d565dec1f214 WatchSource:0}: Error finding container 821b6089d2b60615af52ca8624ff390650fa09a3f5027f733505d565dec1f214: Status 404 returned error can't find the container with id 821b6089d2b60615af52ca8624ff390650fa09a3f5027f733505d565dec1f214
Apr 24 21:18:03.567287 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:03.567215 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ccjvm"]
Apr 24 21:18:03.570353 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:03.570327 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a9aad5_ac45_4187_9590_f87a0a2157cd.slice/crio-af3cff244915f7f68a395cab78f1b01b0ded28ffa048b9fbc5904cdb34b54cce WatchSource:0}: Error finding container af3cff244915f7f68a395cab78f1b01b0ded28ffa048b9fbc5904cdb34b54cce: Status 404 returned error can't find the container with id af3cff244915f7f68a395cab78f1b01b0ded28ffa048b9fbc5904cdb34b54cce
Apr 24 21:18:04.051432 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:04.051391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ccjvm" event={"ID":"a3a9aad5-ac45-4187-9590-f87a0a2157cd","Type":"ContainerStarted","Data":"c1b67b12cd9e49aa4fe738d44bd1443e8849d22cfbd3a4ab100e864d88729637"}
Apr 24 21:18:04.051432 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:04.051434 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ccjvm" event={"ID":"a3a9aad5-ac45-4187-9590-f87a0a2157cd","Type":"ContainerStarted","Data":"af3cff244915f7f68a395cab78f1b01b0ded28ffa048b9fbc5904cdb34b54cce"}
Apr 24 21:18:04.053115 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:04.053089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7" event={"ID":"22fc467d-f2cb-40ba-8129-3562ce16391d","Type":"ContainerStarted","Data":"821b6089d2b60615af52ca8624ff390650fa09a3f5027f733505d565dec1f214"}
Apr 24 21:18:04.053442 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:04.053420 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:04.055061 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:04.054892 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d76d9f5d-zjqpm"
Apr 24 21:18:05.030675 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:05.030649 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6bsmb"
Apr 24 21:18:05.059279 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:05.059247 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ccjvm" event={"ID":"a3a9aad5-ac45-4187-9590-f87a0a2157cd","Type":"ContainerStarted","Data":"021f3d6cacff3000580f5526c8d47c7ed093eb0291e95c9e43452b4926e570c2"}
Apr 24 21:18:05.060694 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:05.060669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7" event={"ID":"22fc467d-f2cb-40ba-8129-3562ce16391d","Type":"ContainerStarted","Data":"2a03aa1dbed00fdb12c0310dc1a255f86702a0e674402a469ccdf0588c73af38"}
Apr 24 21:18:05.078358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:05.078308 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5t8j7" podStartSLOduration=1.858947209 podStartE2EDuration="3.07829126s" podCreationTimestamp="2026-04-24 21:18:02 +0000 UTC" firstStartedPulling="2026-04-24 21:18:03.465088758 +0000 UTC m=+107.432890798" lastFinishedPulling="2026-04-24 21:18:04.684432805 +0000 UTC m=+108.652234849" observedRunningTime="2026-04-24 21:18:05.077528147 +0000 UTC m=+109.045330208" watchObservedRunningTime="2026-04-24 21:18:05.07829126 +0000 UTC m=+109.046093320"
Apr 24 21:18:06.065179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:06.065145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ccjvm" event={"ID":"a3a9aad5-ac45-4187-9590-f87a0a2157cd","Type":"ContainerStarted","Data":"38d7fe921a6760137d9fe264d912f1c1138b4526dbd4acddf986cd2702560845"}
Apr 24 21:18:06.086051 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:06.086012 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ccjvm" podStartSLOduration=0.854673357 podStartE2EDuration="3.085999357s" podCreationTimestamp="2026-04-24 21:18:03 +0000 UTC" firstStartedPulling="2026-04-24 21:18:03.619159956 +0000 UTC m=+107.586961996" lastFinishedPulling="2026-04-24 21:18:05.850485943 +0000 UTC m=+109.818287996" observedRunningTime="2026-04-24 21:18:06.084952719 +0000 UTC m=+110.052754796" watchObservedRunningTime="2026-04-24 21:18:06.085999357 +0000 UTC m=+110.053801418"
Apr 24 21:18:10.533486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.533452 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"]
Apr 24 21:18:10.536171 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.536154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"
Apr 24 21:18:10.540226 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.540203 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:18:10.540337 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.540207 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 24 21:18:10.540337 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.540251 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:18:10.540455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.540266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 24 21:18:10.541548 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.541524 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:18:10.541714 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.541695 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6dblw\""
Apr 24 21:18:10.547987 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.547967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"]
Apr 24 21:18:10.565092 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.565066 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"]
Apr 24 21:18:10.567066 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.567048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.567454 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.567429 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ntgvc"]
Apr 24 21:18:10.569355 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.569339 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.569834 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.569818 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 21:18:10.569902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.569882 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-rrpd8\""
Apr 24 21:18:10.570857 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.570838 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 21:18:10.571022 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.570999 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 21:18:10.574304 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.574281 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:18:10.574529 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.574503 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f2sfg\""
Apr 24 21:18:10.574529 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.574516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:18:10.574672 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.574661 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:18:10.581802 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.581780 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"]
Apr 24 21:18:10.636018 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.635991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"
Apr 24 21:18:10.636162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-accelerators-collector-config\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mxrb\" (UniqueName: \"kubernetes.io/projected/e6a7cb9f-8906-4db9-a2f9-ae926946111a-kube-api-access-5mxrb\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"
Apr 24 21:18:10.636162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a2d6c6-7f94-4021-965e-83df5466f932-metrics-client-ca\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a173019-029b-4950-854e-8165aa0b2dd9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.636162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636150 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfh8\" (UniqueName: \"kubernetes.io/projected/0a173019-029b-4950-854e-8165aa0b2dd9-kube-api-access-wxfh8\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.636439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-wtmp\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-sys\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-root\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc84z\" (UniqueName: \"kubernetes.io/projected/e4a2d6c6-7f94-4021-965e-83df5466f932-kube-api-access-pc84z\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0a173019-029b-4950-854e-8165aa0b2dd9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.636439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"
Apr 24 21:18:10.636664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a7cb9f-8906-4db9-a2f9-ae926946111a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"
Apr 24 21:18:10.636664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-tls\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.636664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.636664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-textfile\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.636664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.636581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"
Apr 24 21:18:10.737814 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-accelerators-collector-config\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc"
Apr 24 21:18:10.738006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mxrb\" (UniqueName:
\"kubernetes.io/projected/e6a7cb9f-8906-4db9-a2f9-ae926946111a-kube-api-access-5mxrb\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.738006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a2d6c6-7f94-4021-965e-83df5466f932-metrics-client-ca\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a173019-029b-4950-854e-8165aa0b2dd9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxfh8\" (UniqueName: \"kubernetes.io/projected/0a173019-029b-4950-854e-8165aa0b2dd9-kube-api-access-wxfh8\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-wtmp\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738006 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:18:10.737939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-sys\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.737991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-sys\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-root\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc84z\" (UniqueName: \"kubernetes.io/projected/e4a2d6c6-7f94-4021-965e-83df5466f932-kube-api-access-pc84z\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0a173019-029b-4950-854e-8165aa0b2dd9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738336 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:18:10.738119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-root\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a7cb9f-8906-4db9-a2f9-ae926946111a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-tls\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-textfile\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0a173019-029b-4950-854e-8165aa0b2dd9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-wtmp\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:18:10.738459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:10.738515 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.738555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a2d6c6-7f94-4021-965e-83df5466f932-metrics-client-ca\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:18:10.738561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-accelerators-collector-config\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.738721 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:10.738612 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-tls podName:e4a2d6c6-7f94-4021-965e-83df5466f932 nodeName:}" failed. No retries permitted until 2026-04-24 21:18:11.238592942 +0000 UTC m=+115.206394983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-tls") pod "node-exporter-ntgvc" (UID: "e4a2d6c6-7f94-4021-965e-83df5466f932") : secret "node-exporter-tls" not found Apr 24 21:18:10.739438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.739045 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-textfile\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.739438 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:10.739128 2578 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 21:18:10.739438 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:10.739192 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-tls podName:e6a7cb9f-8906-4db9-a2f9-ae926946111a nodeName:}" failed. 
No retries permitted until 2026-04-24 21:18:11.239174771 +0000 UTC m=+115.206976812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-k2qk8" (UID: "e6a7cb9f-8906-4db9-a2f9-ae926946111a") : secret "openshift-state-metrics-tls" not found Apr 24 21:18:10.739601 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.739505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a7cb9f-8906-4db9-a2f9-ae926946111a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.739601 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.739569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a173019-029b-4950-854e-8165aa0b2dd9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.739705 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.739636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.740902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.740876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.741090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.741073 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.741153 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.741092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.741153 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.741101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a173019-029b-4950-854e-8165aa0b2dd9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.746945 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.746920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc84z\" (UniqueName: \"kubernetes.io/projected/e4a2d6c6-7f94-4021-965e-83df5466f932-kube-api-access-pc84z\") pod \"node-exporter-ntgvc\" 
(UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:10.748020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.747984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mxrb\" (UniqueName: \"kubernetes.io/projected/e6a7cb9f-8906-4db9-a2f9-ae926946111a-kube-api-access-5mxrb\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:10.748263 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.748239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxfh8\" (UniqueName: \"kubernetes.io/projected/0a173019-029b-4950-854e-8165aa0b2dd9-kube-api-access-wxfh8\") pod \"kube-state-metrics-69db897b98-zqzx4\" (UID: \"0a173019-029b-4950-854e-8165aa0b2dd9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.880592 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.880522 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" Apr 24 21:18:10.996520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:10.996492 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zqzx4"] Apr 24 21:18:10.999584 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:10.999556 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a173019_029b_4950_854e_8165aa0b2dd9.slice/crio-a7838ad4a4c63e441430ab2a064b403b4c9938cb18f2db8e2ecc5f954bd1a000 WatchSource:0}: Error finding container a7838ad4a4c63e441430ab2a064b403b4c9938cb18f2db8e2ecc5f954bd1a000: Status 404 returned error can't find the container with id a7838ad4a4c63e441430ab2a064b403b4c9938cb18f2db8e2ecc5f954bd1a000 Apr 24 21:18:11.077882 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.077851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" event={"ID":"0a173019-029b-4950-854e-8165aa0b2dd9","Type":"ContainerStarted","Data":"a7838ad4a4c63e441430ab2a064b403b4c9938cb18f2db8e2ecc5f954bd1a000"} Apr 24 21:18:11.243476 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.243430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:11.243639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.243539 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-tls\") pod \"node-exporter-ntgvc\" (UID: 
\"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:11.245893 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.245869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4a2d6c6-7f94-4021-965e-83df5466f932-node-exporter-tls\") pod \"node-exporter-ntgvc\" (UID: \"e4a2d6c6-7f94-4021-965e-83df5466f932\") " pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:11.245992 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.245926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6a7cb9f-8906-4db9-a2f9-ae926946111a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2qk8\" (UID: \"e6a7cb9f-8906-4db9-a2f9-ae926946111a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:11.445475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.445429 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" Apr 24 21:18:11.487689 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.487645 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ntgvc" Apr 24 21:18:11.496625 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:11.496589 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a2d6c6_7f94_4021_965e_83df5466f932.slice/crio-0075d9f82d84f490df565e586141b752724c36b37076894f10a3af3dcffa509b WatchSource:0}: Error finding container 0075d9f82d84f490df565e586141b752724c36b37076894f10a3af3dcffa509b: Status 404 returned error can't find the container with id 0075d9f82d84f490df565e586141b752724c36b37076894f10a3af3dcffa509b Apr 24 21:18:11.572707 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.572670 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8"] Apr 24 21:18:11.578402 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:11.578370 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a7cb9f_8906_4db9_a2f9_ae926946111a.slice/crio-aabcabd0b250b4075e5987ffb712e9b08e00ee264237a03db7ec966b423a7976 WatchSource:0}: Error finding container aabcabd0b250b4075e5987ffb712e9b08e00ee264237a03db7ec966b423a7976: Status 404 returned error can't find the container with id aabcabd0b250b4075e5987ffb712e9b08e00ee264237a03db7ec966b423a7976 Apr 24 21:18:11.629454 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.629427 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:18:11.632299 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.632278 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.635005 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.634869 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:18:11.635005 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.634881 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:18:11.635005 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.634921 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:18:11.635294 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.635275 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qn8sh\"" Apr 24 21:18:11.635481 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.635466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:18:11.635662 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.635647 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:18:11.635997 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.635979 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:18:11.636092 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.636000 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:18:11.636151 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.636100 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:18:11.636209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.636149 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:18:11.650033 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.649989 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:18:11.746308 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746308 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-web-config\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746310 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-volume\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-out\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:18:11.746477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47n8t\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-kube-api-access-47n8t\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.746910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.746572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847422 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-web-config\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847510 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-volume\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-out\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47n8t\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-kube-api-access-47n8t\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.847966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.847764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.850673 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:11.849566 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle podName:fe7f1595-026d-48d1-845e-1fcf8bc412ef nodeName:}" failed. 
No retries permitted until 2026-04-24 21:18:12.349544211 +0000 UTC m=+116.317346271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef") : configmap references non-existent config key: ca-bundle.crt Apr 24 21:18:11.850673 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.850374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.851057 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.850895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.851057 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.850986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.852000 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.851940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.852000 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.851959 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.852435 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.852388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.853091 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.852681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-volume\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.853679 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.853652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.853977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.853939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.853977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.853950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-web-config\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.854527 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.854485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-out\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:11.857476 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:11.857457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47n8t\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-kube-api-access-47n8t\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:12.084089 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.083994 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntgvc" event={"ID":"e4a2d6c6-7f94-4021-965e-83df5466f932","Type":"ContainerStarted","Data":"0075d9f82d84f490df565e586141b752724c36b37076894f10a3af3dcffa509b"} Apr 24 21:18:12.085731 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.085704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" 
event={"ID":"e6a7cb9f-8906-4db9-a2f9-ae926946111a","Type":"ContainerStarted","Data":"fa41ed5567fdff2fa748fa0636001088fe23067d7be07716990bab361c9298a3"} Apr 24 21:18:12.085731 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.085734 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" event={"ID":"e6a7cb9f-8906-4db9-a2f9-ae926946111a","Type":"ContainerStarted","Data":"3ec91fac5c3a014901b1a514aabaf925a534fe16597e48d6db2de54ee8de7b09"} Apr 24 21:18:12.085898 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.085756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" event={"ID":"e6a7cb9f-8906-4db9-a2f9-ae926946111a","Type":"ContainerStarted","Data":"aabcabd0b250b4075e5987ffb712e9b08e00ee264237a03db7ec966b423a7976"} Apr 24 21:18:12.352177 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.352149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:12.353143 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.353084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:12.543916 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.543842 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:18:12.699697 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:12.699664 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:18:12.704807 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:12.704778 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe7f1595_026d_48d1_845e_1fcf8bc412ef.slice/crio-74275d07d88932b84928f3d27ae52462cf0e231bff10f32a02013527cc984cde WatchSource:0}: Error finding container 74275d07d88932b84928f3d27ae52462cf0e231bff10f32a02013527cc984cde: Status 404 returned error can't find the container with id 74275d07d88932b84928f3d27ae52462cf0e231bff10f32a02013527cc984cde Apr 24 21:18:13.080794 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.080774 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:18:13.090835 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.090805 2578 generic.go:358] "Generic (PLEG): container finished" podID="e4a2d6c6-7f94-4021-965e-83df5466f932" containerID="5c12407c74d383af8e9f4d077a743bcac94358743b9c878beeb9886e03646aee" exitCode=0 Apr 24 21:18:13.090937 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.090879 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntgvc" event={"ID":"e4a2d6c6-7f94-4021-965e-83df5466f932","Type":"ContainerDied","Data":"5c12407c74d383af8e9f4d077a743bcac94358743b9c878beeb9886e03646aee"} Apr 24 21:18:13.094435 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.094402 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" event={"ID":"0a173019-029b-4950-854e-8165aa0b2dd9","Type":"ContainerStarted","Data":"21cedb4b3213f35a1baa22ba72d5664ff61500144aaebba5e3118286e073425c"} Apr 24 
21:18:13.094523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.094441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" event={"ID":"0a173019-029b-4950-854e-8165aa0b2dd9","Type":"ContainerStarted","Data":"1cd3d231ee02d195d45a1cf742ca6aad3bc410df45081627b735ab93726cf5fa"} Apr 24 21:18:13.094523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.094456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" event={"ID":"0a173019-029b-4950-854e-8165aa0b2dd9","Type":"ContainerStarted","Data":"2a47265f47d47df4a41d73543ea615204ff2c9ac3f3846d81e64fefd4569a8d3"} Apr 24 21:18:13.095575 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.095543 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"74275d07d88932b84928f3d27ae52462cf0e231bff10f32a02013527cc984cde"} Apr 24 21:18:13.186855 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.186204 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" podStartSLOduration=1.843350826 podStartE2EDuration="3.186183752s" podCreationTimestamp="2026-04-24 21:18:10 +0000 UTC" firstStartedPulling="2026-04-24 21:18:11.687831883 +0000 UTC m=+115.655633930" lastFinishedPulling="2026-04-24 21:18:13.030664803 +0000 UTC m=+116.998466856" observedRunningTime="2026-04-24 21:18:13.157724245 +0000 UTC m=+117.125526307" watchObservedRunningTime="2026-04-24 21:18:13.186183752 +0000 UTC m=+117.153985813" Apr 24 21:18:13.186855 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:13.186338 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqzx4" podStartSLOduration=1.852014009 podStartE2EDuration="3.186331888s" 
podCreationTimestamp="2026-04-24 21:18:10 +0000 UTC" firstStartedPulling="2026-04-24 21:18:11.001482488 +0000 UTC m=+114.969284529" lastFinishedPulling="2026-04-24 21:18:12.335800354 +0000 UTC m=+116.303602408" observedRunningTime="2026-04-24 21:18:13.184924448 +0000 UTC m=+117.152726511" watchObservedRunningTime="2026-04-24 21:18:13.186331888 +0000 UTC m=+117.154133952" Apr 24 21:18:14.099932 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:14.099896 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808" exitCode=0 Apr 24 21:18:14.100355 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:14.099981 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808"} Apr 24 21:18:14.101981 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:14.101947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntgvc" event={"ID":"e4a2d6c6-7f94-4021-965e-83df5466f932","Type":"ContainerStarted","Data":"f1751fc4166dadc780a4d349062aaeccc8f09302d6f41837c6f0b1319ce123b1"} Apr 24 21:18:14.102082 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:14.101992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntgvc" event={"ID":"e4a2d6c6-7f94-4021-965e-83df5466f932","Type":"ContainerStarted","Data":"fbfdf202f728814e728e0c026d136cc840411c994b40f3a4972b318c9c2386ff"} Apr 24 21:18:14.103740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:14.103718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2qk8" 
event={"ID":"e6a7cb9f-8906-4db9-a2f9-ae926946111a","Type":"ContainerStarted","Data":"b4ebefeab5fb424c300454824b53d83253bdcb0d486afa014b34f84c42c990c4"} Apr 24 21:18:14.153016 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:14.152961 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ntgvc" podStartSLOduration=3.317552828 podStartE2EDuration="4.152946668s" podCreationTimestamp="2026-04-24 21:18:10 +0000 UTC" firstStartedPulling="2026-04-24 21:18:11.501885576 +0000 UTC m=+115.469687616" lastFinishedPulling="2026-04-24 21:18:12.337279404 +0000 UTC m=+116.305081456" observedRunningTime="2026-04-24 21:18:14.151875264 +0000 UTC m=+118.119677341" watchObservedRunningTime="2026-04-24 21:18:14.152946668 +0000 UTC m=+118.120748765" Apr 24 21:18:15.363030 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.363009 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t"] Apr 24 21:18:15.365785 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.365739 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:15.368412 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.368394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-skgjh\"" Apr 24 21:18:15.368847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.368819 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:18:15.382939 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.382916 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t"] Apr 24 21:18:15.482808 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.482768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7f6ca405-4c66-464f-ab4b-de5695efac52-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6bf8t\" (UID: \"7f6ca405-4c66-464f-ab4b-de5695efac52\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:15.584098 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:15.584073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7f6ca405-4c66-464f-ab4b-de5695efac52-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6bf8t\" (UID: \"7f6ca405-4c66-464f-ab4b-de5695efac52\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:15.584247 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:15.584229 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 21:18:15.584292 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:15.584287 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7f6ca405-4c66-464f-ab4b-de5695efac52-monitoring-plugin-cert podName:7f6ca405-4c66-464f-ab4b-de5695efac52 nodeName:}" failed. No retries permitted until 2026-04-24 21:18:16.084271584 +0000 UTC m=+120.052073626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/7f6ca405-4c66-464f-ab4b-de5695efac52-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-6bf8t" (UID: "7f6ca405-4c66-464f-ab4b-de5695efac52") : secret "monitoring-plugin-cert" not found Apr 24 21:18:16.088733 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.088688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7f6ca405-4c66-464f-ab4b-de5695efac52-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6bf8t\" (UID: \"7f6ca405-4c66-464f-ab4b-de5695efac52\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:16.091257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.091231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7f6ca405-4c66-464f-ab4b-de5695efac52-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6bf8t\" (UID: \"7f6ca405-4c66-464f-ab4b-de5695efac52\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:16.112087 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.112053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267"} Apr 24 21:18:16.112178 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.112091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9"} Apr 24 21:18:16.112178 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.112105 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23"} Apr 24 21:18:16.112178 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.112116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316"} Apr 24 21:18:16.112178 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.112129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9"} Apr 24 21:18:16.275721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.275687 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:16.453220 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:16.453192 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t"] Apr 24 21:18:16.456090 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:16.456065 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6ca405_4c66_464f_ab4b_de5695efac52.slice/crio-0d8cd9c63f52d3556c6eefd18e64141fe1236b993a5453b129f6cc789ef60b59 WatchSource:0}: Error finding container 0d8cd9c63f52d3556c6eefd18e64141fe1236b993a5453b129f6cc789ef60b59: Status 404 returned error can't find the container with id 0d8cd9c63f52d3556c6eefd18e64141fe1236b993a5453b129f6cc789ef60b59 Apr 24 21:18:17.116079 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:17.116033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" event={"ID":"7f6ca405-4c66-464f-ab4b-de5695efac52","Type":"ContainerStarted","Data":"0d8cd9c63f52d3556c6eefd18e64141fe1236b993a5453b129f6cc789ef60b59"} Apr 24 21:18:17.119182 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:17.119153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerStarted","Data":"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a"} Apr 24 21:18:17.150919 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:17.150868 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.5006651619999998 podStartE2EDuration="6.150851372s" podCreationTimestamp="2026-04-24 21:18:11 +0000 UTC" firstStartedPulling="2026-04-24 21:18:12.707207956 +0000 UTC m=+116.675010008" lastFinishedPulling="2026-04-24 21:18:16.357394178 +0000 
UTC m=+120.325196218" observedRunningTime="2026-04-24 21:18:17.149411202 +0000 UTC m=+121.117213266" watchObservedRunningTime="2026-04-24 21:18:17.150851372 +0000 UTC m=+121.118653434" Apr 24 21:18:18.123204 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:18.123111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" event={"ID":"7f6ca405-4c66-464f-ab4b-de5695efac52","Type":"ContainerStarted","Data":"c5b4a1f1cc86ae6c0620384eeefe5a8b1807be90fbc5644b59201b0906e602f9"} Apr 24 21:18:18.123600 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:18.123405 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:18.128477 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:18.128459 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" Apr 24 21:18:18.138045 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:18.138001 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6bf8t" podStartSLOduration=1.938181709 podStartE2EDuration="3.137989492s" podCreationTimestamp="2026-04-24 21:18:15 +0000 UTC" firstStartedPulling="2026-04-24 21:18:16.457917843 +0000 UTC m=+120.425719897" lastFinishedPulling="2026-04-24 21:18:17.657725639 +0000 UTC m=+121.625527680" observedRunningTime="2026-04-24 21:18:18.136689083 +0000 UTC m=+122.104491143" watchObservedRunningTime="2026-04-24 21:18:18.137989492 +0000 UTC m=+122.105791553" Apr 24 21:18:25.361055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.361019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " 
pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:18:25.363393 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.363372 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371c1fec-a68a-4ff5-b5fc-29a34feb3ffe-metrics-certs\") pod \"network-metrics-daemon-bcqjb\" (UID: \"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe\") " pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:18:25.579222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.579193 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2rwk\"" Apr 24 21:18:25.586066 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.586053 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcqjb" Apr 24 21:18:25.693727 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.693699 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-bh9lr"] Apr 24 21:18:25.700260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.700241 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-bh9lr" Apr 24 21:18:25.702999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.702982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sckr4\"" Apr 24 21:18:25.703678 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.703665 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:18:25.703912 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.703894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:18:25.709732 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.709710 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-bh9lr"] Apr 24 21:18:25.710964 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.710945 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcqjb"] Apr 24 21:18:25.715824 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:25.715786 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod371c1fec_a68a_4ff5_b5fc_29a34feb3ffe.slice/crio-e39ac0dbae41902b60616a1ac14c8728ae6b62ab163c12f10b78db4d1fd4e355 WatchSource:0}: Error finding container e39ac0dbae41902b60616a1ac14c8728ae6b62ab163c12f10b78db4d1fd4e355: Status 404 returned error can't find the container with id e39ac0dbae41902b60616a1ac14c8728ae6b62ab163c12f10b78db4d1fd4e355 Apr 24 21:18:25.763676 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.763648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvxfp\" (UniqueName: \"kubernetes.io/projected/bda5e38a-75b7-4355-b330-717228aa7a75-kube-api-access-zvxfp\") pod \"downloads-6bcc868b7-bh9lr\" (UID: 
\"bda5e38a-75b7-4355-b330-717228aa7a75\") " pod="openshift-console/downloads-6bcc868b7-bh9lr" Apr 24 21:18:25.864887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.864861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxfp\" (UniqueName: \"kubernetes.io/projected/bda5e38a-75b7-4355-b330-717228aa7a75-kube-api-access-zvxfp\") pod \"downloads-6bcc868b7-bh9lr\" (UID: \"bda5e38a-75b7-4355-b330-717228aa7a75\") " pod="openshift-console/downloads-6bcc868b7-bh9lr" Apr 24 21:18:25.872977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:25.872958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxfp\" (UniqueName: \"kubernetes.io/projected/bda5e38a-75b7-4355-b330-717228aa7a75-kube-api-access-zvxfp\") pod \"downloads-6bcc868b7-bh9lr\" (UID: \"bda5e38a-75b7-4355-b330-717228aa7a75\") " pod="openshift-console/downloads-6bcc868b7-bh9lr" Apr 24 21:18:26.010027 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:26.009992 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-bh9lr" Apr 24 21:18:26.124433 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:26.124347 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-bh9lr"] Apr 24 21:18:26.127041 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:26.127010 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda5e38a_75b7_4355_b330_717228aa7a75.slice/crio-1f72cd84848c7631ac77493948a4eb3b10b2893b09ea6c39ee32b7a20db0fce7 WatchSource:0}: Error finding container 1f72cd84848c7631ac77493948a4eb3b10b2893b09ea6c39ee32b7a20db0fce7: Status 404 returned error can't find the container with id 1f72cd84848c7631ac77493948a4eb3b10b2893b09ea6c39ee32b7a20db0fce7 Apr 24 21:18:26.146727 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:26.146699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-bh9lr" event={"ID":"bda5e38a-75b7-4355-b330-717228aa7a75","Type":"ContainerStarted","Data":"1f72cd84848c7631ac77493948a4eb3b10b2893b09ea6c39ee32b7a20db0fce7"} Apr 24 21:18:26.147918 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:26.147882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcqjb" event={"ID":"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe","Type":"ContainerStarted","Data":"e39ac0dbae41902b60616a1ac14c8728ae6b62ab163c12f10b78db4d1fd4e355"} Apr 24 21:18:27.153189 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:27.153139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcqjb" event={"ID":"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe","Type":"ContainerStarted","Data":"8d003ffc9e3ffed8f46e211f6efe4d0793e84649d013e25784ec3753de4f6192"} Apr 24 21:18:27.153189 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:27.153189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-bcqjb" event={"ID":"371c1fec-a68a-4ff5-b5fc-29a34feb3ffe","Type":"ContainerStarted","Data":"cb7f8579380b9a2e95f4d10b33f233a341c30e58b99524cb7c8fb996c0a77800"} Apr 24 21:18:27.171122 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:27.171067 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bcqjb" podStartSLOduration=129.210618654 podStartE2EDuration="2m10.171054565s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="2026-04-24 21:18:25.718523828 +0000 UTC m=+129.686325870" lastFinishedPulling="2026-04-24 21:18:26.678959735 +0000 UTC m=+130.646761781" observedRunningTime="2026-04-24 21:18:27.169031875 +0000 UTC m=+131.136833936" watchObservedRunningTime="2026-04-24 21:18:27.171054565 +0000 UTC m=+131.138856627" Apr 24 21:18:28.095810 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.095731 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-bd77785df-clsxr" podUID="d2f78a7f-b7ec-4945-937f-606241594124" containerName="registry" containerID="cri-o://68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da" gracePeriod=30 Apr 24 21:18:28.339691 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.339657 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:18:28.486517 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486486 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f78a7f-b7ec-4945-937f-606241594124-ca-trust-extracted\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.486685 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486535 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-image-registry-private-configuration\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.486685 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486565 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.486685 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486601 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-trusted-ca\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.486685 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486624 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-bound-sa-token\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 
21:18:28.486685 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486645 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-registry-certificates\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.486932 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486688 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-installation-pull-secrets\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.486932 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.486725 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kllxj\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-kube-api-access-kllxj\") pod \"d2f78a7f-b7ec-4945-937f-606241594124\" (UID: \"d2f78a7f-b7ec-4945-937f-606241594124\") " Apr 24 21:18:28.487431 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.487399 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:28.488134 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.488085 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:28.489638 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.489596 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:28.490021 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.489991 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-kube-api-access-kllxj" (OuterVolumeSpecName: "kube-api-access-kllxj") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "kube-api-access-kllxj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:28.490100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.490061 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:28.490156 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.490140 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:28.490909 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.490886 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:28.496009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.495988 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f78a7f-b7ec-4945-937f-606241594124-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d2f78a7f-b7ec-4945-937f-606241594124" (UID: "d2f78a7f-b7ec-4945-937f-606241594124"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:18:28.588253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588229 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f78a7f-b7ec-4945-937f-606241594124-ca-trust-extracted\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:28.588253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588253 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-image-registry-private-configuration\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:28.588381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588264 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-registry-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" 
Apr 24 21:18:28.588381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588280 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-trusted-ca\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:28.588381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588289 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-bound-sa-token\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:28.588381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588297 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f78a7f-b7ec-4945-937f-606241594124-registry-certificates\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:28.588381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588305 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f78a7f-b7ec-4945-937f-606241594124-installation-pull-secrets\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:28.588381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:28.588314 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kllxj\" (UniqueName: \"kubernetes.io/projected/d2f78a7f-b7ec-4945-937f-606241594124-kube-api-access-kllxj\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:18:29.160509 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.160467 2578 generic.go:358] "Generic (PLEG): container finished" podID="d2f78a7f-b7ec-4945-937f-606241594124" containerID="68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da" exitCode=0 Apr 24 21:18:29.160666 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.160532 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-bd77785df-clsxr" event={"ID":"d2f78a7f-b7ec-4945-937f-606241594124","Type":"ContainerDied","Data":"68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da"} Apr 24 21:18:29.160666 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.160539 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bd77785df-clsxr" Apr 24 21:18:29.160666 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.160562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bd77785df-clsxr" event={"ID":"d2f78a7f-b7ec-4945-937f-606241594124","Type":"ContainerDied","Data":"680542c3f9b84d77ee04aae5d29040e76196c0ea8873c485d86c64a95da0cb62"} Apr 24 21:18:29.160666 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.160583 2578 scope.go:117] "RemoveContainer" containerID="68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da" Apr 24 21:18:29.169977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.169957 2578 scope.go:117] "RemoveContainer" containerID="68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da" Apr 24 21:18:29.170255 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:18:29.170229 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da\": container with ID starting with 68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da not found: ID does not exist" containerID="68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da" Apr 24 21:18:29.170325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.170268 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da"} err="failed to get container status 
\"68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da\": rpc error: code = NotFound desc = could not find container \"68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da\": container with ID starting with 68dcd924889952d6838a57167a5c473f6e573ce4cb80f450123b44cdc7f7d2da not found: ID does not exist" Apr 24 21:18:29.178656 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.178630 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bd77785df-clsxr"] Apr 24 21:18:29.182921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:29.182894 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-bd77785df-clsxr"] Apr 24 21:18:30.662589 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:30.662553 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f78a7f-b7ec-4945-937f-606241594124" path="/var/lib/kubelet/pods/d2f78a7f-b7ec-4945-937f-606241594124/volumes" Apr 24 21:18:36.214179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.214148 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d8dccbbb-wsdkj"] Apr 24 21:18:36.214733 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.214570 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2f78a7f-b7ec-4945-937f-606241594124" containerName="registry" Apr 24 21:18:36.214733 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.214588 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f78a7f-b7ec-4945-937f-606241594124" containerName="registry" Apr 24 21:18:36.214733 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.214667 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2f78a7f-b7ec-4945-937f-606241594124" containerName="registry" Apr 24 21:18:36.216537 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.216515 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.220305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.220050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:18:36.220305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.220067 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:18:36.220305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.220050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:18:36.220305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.220158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:18:36.220305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.220110 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:18:36.220679 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.220393 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-25kmc\"" Apr 24 21:18:36.226855 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.226834 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d8dccbbb-wsdkj"] Apr 24 21:18:36.349465 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.349431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-oauth-config\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.349639 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 21:18:36.349498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-service-ca\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.349639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.349529 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-oauth-serving-cert\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.349639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.349554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv2z\" (UniqueName: \"kubernetes.io/projected/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-kube-api-access-znv2z\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.349828 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.349646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-serving-cert\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.349828 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.349685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-config\") pod \"console-7d8dccbbb-wsdkj\" (UID: 
\"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.450723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.450690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-oauth-config\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.450911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.450758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-service-ca\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.450911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.450779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-oauth-serving-cert\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.450911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.450796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znv2z\" (UniqueName: \"kubernetes.io/projected/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-kube-api-access-znv2z\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.450911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.450815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-serving-cert\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.450911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.450834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-config\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.451528 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.451500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-oauth-serving-cert\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.451623 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.451564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-config\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.451623 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.451585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-service-ca\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.453678 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.453655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-oauth-config\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.453868 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.453783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-serving-cert\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.458899 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.458866 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv2z\" (UniqueName: \"kubernetes.io/projected/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-kube-api-access-znv2z\") pod \"console-7d8dccbbb-wsdkj\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") " pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:36.527613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:36.527554 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d8dccbbb-wsdkj" Apr 24 21:18:42.880117 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:42.880091 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d8dccbbb-wsdkj"] Apr 24 21:18:42.883204 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:42.883175 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f47a2a_86ee_4f10_bd8b_3f5db0e21bda.slice/crio-86b356f66734c5789890050e325dc31fe6ed4a22ec6f892baff51b109b69ead1 WatchSource:0}: Error finding container 86b356f66734c5789890050e325dc31fe6ed4a22ec6f892baff51b109b69ead1: Status 404 returned error can't find the container with id 86b356f66734c5789890050e325dc31fe6ed4a22ec6f892baff51b109b69ead1 Apr 24 21:18:43.205970 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:43.205929 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-bh9lr" event={"ID":"bda5e38a-75b7-4355-b330-717228aa7a75","Type":"ContainerStarted","Data":"bc0b2b08a9294fa83acedd6a9f3f690db7ab6e8d96e7d1bdcd3cf676e7630e44"} Apr 24 21:18:43.206173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:43.206084 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-bh9lr" Apr 24 21:18:43.207309 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:43.207276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8dccbbb-wsdkj" event={"ID":"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda","Type":"ContainerStarted","Data":"86b356f66734c5789890050e325dc31fe6ed4a22ec6f892baff51b109b69ead1"} Apr 24 21:18:43.223483 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:43.223433 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-bh9lr" podStartSLOduration=1.526350904 podStartE2EDuration="18.223419917s" 
podCreationTimestamp="2026-04-24 21:18:25 +0000 UTC" firstStartedPulling="2026-04-24 21:18:26.129083799 +0000 UTC m=+130.096885839" lastFinishedPulling="2026-04-24 21:18:42.826152809 +0000 UTC m=+146.793954852" observedRunningTime="2026-04-24 21:18:43.221214524 +0000 UTC m=+147.189016588" watchObservedRunningTime="2026-04-24 21:18:43.223419917 +0000 UTC m=+147.191221978"
Apr 24 21:18:43.224799 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:43.224777 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-bh9lr"
Apr 24 21:18:44.986903 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:44.986817 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67f946669c-79tgc"]
Apr 24 21:18:45.004077 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.004046 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67f946669c-79tgc"]
Apr 24 21:18:45.004257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.004121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.014390 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.014365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:18:45.130293 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdls\" (UniqueName: \"kubernetes.io/projected/5c925852-4ea8-4bb0-8044-486ad660258d-kube-api-access-bpdls\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.130504 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-serving-cert\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.130504 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-oauth-serving-cert\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.130639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-service-ca\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.130639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-oauth-config\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.130639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-console-config\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.130639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.130605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-trusted-ca-bundle\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.231163 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-service-ca\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.231358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-oauth-config\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.231358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-console-config\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.231358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231261 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-trusted-ca-bundle\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.231358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpdls\" (UniqueName: \"kubernetes.io/projected/5c925852-4ea8-4bb0-8044-486ad660258d-kube-api-access-bpdls\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.231358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-serving-cert\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.232380 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.231382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-oauth-serving-cert\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.232380 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.232040 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-service-ca\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.232380 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.232079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-oauth-serving-cert\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.232662 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.232638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-trusted-ca-bundle\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.233245 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.233208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-console-config\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.235811 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.235789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-serving-cert\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.235986 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.235816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-oauth-config\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.240392 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.240284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdls\" (UniqueName: \"kubernetes.io/projected/5c925852-4ea8-4bb0-8044-486ad660258d-kube-api-access-bpdls\") pod \"console-67f946669c-79tgc\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:45.324139 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:45.324092 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:46.135277 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:46.135020 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67f946669c-79tgc"]
Apr 24 21:18:46.137843 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:18:46.137811 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c925852_4ea8_4bb0_8044_486ad660258d.slice/crio-8b4bad5b5680fd068e90b21cf1584e80b369a9d7df8faeb989f723cb30444788 WatchSource:0}: Error finding container 8b4bad5b5680fd068e90b21cf1584e80b369a9d7df8faeb989f723cb30444788: Status 404 returned error can't find the container with id 8b4bad5b5680fd068e90b21cf1584e80b369a9d7df8faeb989f723cb30444788
Apr 24 21:18:46.222470 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:46.222413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67f946669c-79tgc" event={"ID":"5c925852-4ea8-4bb0-8044-486ad660258d","Type":"ContainerStarted","Data":"8b4bad5b5680fd068e90b21cf1584e80b369a9d7df8faeb989f723cb30444788"}
Apr 24 21:18:47.227616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:47.227568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8dccbbb-wsdkj" event={"ID":"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda","Type":"ContainerStarted","Data":"2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544"}
Apr 24 21:18:47.229782 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:47.229729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67f946669c-79tgc" event={"ID":"5c925852-4ea8-4bb0-8044-486ad660258d","Type":"ContainerStarted","Data":"11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d"}
Apr 24 21:18:47.246006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:47.245963 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d8dccbbb-wsdkj" podStartSLOduration=7.685950652 podStartE2EDuration="11.245948638s" podCreationTimestamp="2026-04-24 21:18:36 +0000 UTC" firstStartedPulling="2026-04-24 21:18:42.885129724 +0000 UTC m=+146.852931764" lastFinishedPulling="2026-04-24 21:18:46.445127702 +0000 UTC m=+150.412929750" observedRunningTime="2026-04-24 21:18:47.243736538 +0000 UTC m=+151.211538601" watchObservedRunningTime="2026-04-24 21:18:47.245948638 +0000 UTC m=+151.213750702"
Apr 24 21:18:47.261052 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:47.261010 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67f946669c-79tgc" podStartSLOduration=2.675920969 podStartE2EDuration="3.260994588s" podCreationTimestamp="2026-04-24 21:18:44 +0000 UTC" firstStartedPulling="2026-04-24 21:18:46.140051713 +0000 UTC m=+150.107853759" lastFinishedPulling="2026-04-24 21:18:46.725125318 +0000 UTC m=+150.692927378" observedRunningTime="2026-04-24 21:18:47.259561636 +0000 UTC m=+151.227363708" watchObservedRunningTime="2026-04-24 21:18:47.260994588 +0000 UTC m=+151.228796648"
Apr 24 21:18:55.324646 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:55.324563 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:55.324646 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:55.324638 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:55.329182 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:55.329159 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:55.942136 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:55.942112 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/init-config-reloader/0.log"
Apr 24 21:18:55.949131 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:55.949110 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/alertmanager/0.log"
Apr 24 21:18:56.097780 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.097740 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/config-reloader/0.log"
Apr 24 21:18:56.273771 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.273690 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:18:56.297432 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.297402 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/kube-rbac-proxy-web/0.log"
Apr 24 21:18:56.320919 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.320892 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d8dccbbb-wsdkj"]
Apr 24 21:18:56.497625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.497598 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/kube-rbac-proxy/0.log"
Apr 24 21:18:56.527773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.527699 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d8dccbbb-wsdkj"
Apr 24 21:18:56.705106 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.705083 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/kube-rbac-proxy-metric/0.log"
Apr 24 21:18:56.897619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:56.897595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fe7f1595-026d-48d1-845e-1fcf8bc412ef/prom-label-proxy/0.log"
Apr 24 21:18:57.297515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:57.297439 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqzx4_0a173019-029b-4950-854e-8165aa0b2dd9/kube-state-metrics/0.log"
Apr 24 21:18:57.497192 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:57.497166 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqzx4_0a173019-029b-4950-854e-8165aa0b2dd9/kube-rbac-proxy-main/0.log"
Apr 24 21:18:57.697362 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:57.697335 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqzx4_0a173019-029b-4950-854e-8165aa0b2dd9/kube-rbac-proxy-self/0.log"
Apr 24 21:18:58.097232 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:58.097160 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6bf8t_7f6ca405-4c66-464f-ab4b-de5695efac52/monitoring-plugin/0.log"
Apr 24 21:18:58.897900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:58.897869 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntgvc_e4a2d6c6-7f94-4021-965e-83df5466f932/init-textfile/0.log"
Apr 24 21:18:59.097813 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:59.097787 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntgvc_e4a2d6c6-7f94-4021-965e-83df5466f932/node-exporter/0.log"
Apr 24 21:18:59.297551 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:18:59.297486 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntgvc_e4a2d6c6-7f94-4021-965e-83df5466f932/kube-rbac-proxy/0.log"
Apr 24 21:19:00.097398 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:00.097370 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2qk8_e6a7cb9f-8906-4db9-a2f9-ae926946111a/kube-rbac-proxy-main/0.log"
Apr 24 21:19:00.282398 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:00.282371 2578 generic.go:358] "Generic (PLEG): container finished" podID="80054b0a-2a30-40a5-87a8-568c2346c169" containerID="30427c3f64fc0b3f3ebdb1f004363c6cb6be5acd2136578763d3212fd9261b5b" exitCode=0
Apr 24 21:19:00.282536 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:00.282411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" event={"ID":"80054b0a-2a30-40a5-87a8-568c2346c169","Type":"ContainerDied","Data":"30427c3f64fc0b3f3ebdb1f004363c6cb6be5acd2136578763d3212fd9261b5b"}
Apr 24 21:19:00.282682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:00.282669 2578 scope.go:117] "RemoveContainer" containerID="30427c3f64fc0b3f3ebdb1f004363c6cb6be5acd2136578763d3212fd9261b5b"
Apr 24 21:19:00.298796 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:00.298771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2qk8_e6a7cb9f-8906-4db9-a2f9-ae926946111a/kube-rbac-proxy-self/0.log"
Apr 24 21:19:00.497848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:00.497822 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2qk8_e6a7cb9f-8906-4db9-a2f9-ae926946111a/openshift-state-metrics/0.log"
Apr 24 21:19:01.286235 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:01.286197 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pmmqd" event={"ID":"80054b0a-2a30-40a5-87a8-568c2346c169","Type":"ContainerStarted","Data":"eafe637386e926ed5bdace29f652448f6a8a954c13dcd446af0cdd764357edf4"}
Apr 24 21:19:03.898068 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:03.898040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-5t8j7_22fc467d-f2cb-40ba-8129-3562ce16391d/networking-console-plugin/0.log"
Apr 24 21:19:04.500206 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:04.500181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67f946669c-79tgc_5c925852-4ea8-4bb0-8044-486ad660258d/console/0.log"
Apr 24 21:19:04.697796 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:04.697772 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d8dccbbb-wsdkj_e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda/console/0.log"
Apr 24 21:19:04.898683 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:04.898663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-bh9lr_bda5e38a-75b7-4355-b330-717228aa7a75/download-server/0.log"
Apr 24 21:19:21.339909 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.339850 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d8dccbbb-wsdkj" podUID="e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" containerName="console" containerID="cri-o://2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544" gracePeriod=15
Apr 24 21:19:21.588523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.588501 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d8dccbbb-wsdkj_e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda/console/0.log"
Apr 24 21:19:21.588637 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.588569 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d8dccbbb-wsdkj"
Apr 24 21:19:21.685365 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685342 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-oauth-config\") pod \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") "
Apr 24 21:19:21.685484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685371 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-serving-cert\") pod \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") "
Apr 24 21:19:21.685484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685392 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv2z\" (UniqueName: \"kubernetes.io/projected/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-kube-api-access-znv2z\") pod \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") "
Apr 24 21:19:21.685484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685412 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-config\") pod \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") "
Apr 24 21:19:21.685484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685446 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-oauth-serving-cert\") pod \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") "
Apr 24 21:19:21.685484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685471 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-service-ca\") pod \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\" (UID: \"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda\") "
Apr 24 21:19:21.685857 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685829 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-config" (OuterVolumeSpecName: "console-config") pod "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" (UID: "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:21.685968 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685877 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" (UID: "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:21.686197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.685998 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-service-ca" (OuterVolumeSpecName: "service-ca") pod "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" (UID: "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:21.687653 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.687624 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" (UID: "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:21.687800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.687682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" (UID: "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:21.687800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.687710 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-kube-api-access-znv2z" (OuterVolumeSpecName: "kube-api-access-znv2z") pod "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" (UID: "e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda"). InnerVolumeSpecName "kube-api-access-znv2z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:19:21.786336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.786314 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-oauth-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:19:21.786336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.786334 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:19:21.786443 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.786343 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znv2z\" (UniqueName: \"kubernetes.io/projected/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-kube-api-access-znv2z\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:19:21.786443 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.786353 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-console-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:19:21.786443 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.786362 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-oauth-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:19:21.786443 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:21.786370 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda-service-ca\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:19:22.348078 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.348048 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d8dccbbb-wsdkj_e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda/console/0.log"
Apr 24 21:19:22.348565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.348091 2578 generic.go:358] "Generic (PLEG): container finished" podID="e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" containerID="2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544" exitCode=2
Apr 24 21:19:22.348565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.348125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8dccbbb-wsdkj" event={"ID":"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda","Type":"ContainerDied","Data":"2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544"}
Apr 24 21:19:22.348565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.348155 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d8dccbbb-wsdkj"
Apr 24 21:19:22.348565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.348165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8dccbbb-wsdkj" event={"ID":"e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda","Type":"ContainerDied","Data":"86b356f66734c5789890050e325dc31fe6ed4a22ec6f892baff51b109b69ead1"}
Apr 24 21:19:22.348565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.348183 2578 scope.go:117] "RemoveContainer" containerID="2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544"
Apr 24 21:19:22.356549 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.356529 2578 scope.go:117] "RemoveContainer" containerID="2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544"
Apr 24 21:19:22.356803 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:22.356781 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544\": container with ID starting with 2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544 not found: ID does not exist" containerID="2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544"
Apr 24 21:19:22.356899 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.356809 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544"} err="failed to get container status \"2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544\": rpc error: code = NotFound desc = could not find container \"2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544\": container with ID starting with 2c02d6ba7e2ab07dc5124551d0363de2b2a09952524dccb76c0dbd4e57840544 not found: ID does not exist"
Apr 24 21:19:22.368960 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.368938 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d8dccbbb-wsdkj"]
Apr 24 21:19:22.372152 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.372132 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d8dccbbb-wsdkj"]
Apr 24 21:19:22.659076 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:22.659054 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" path="/var/lib/kubelet/pods/e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda/volumes"
Apr 24 21:19:30.939081 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939048 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:19:30.939543 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939510 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy" containerID="cri-o://f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9" gracePeriod=120
Apr 24 21:19:30.939740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939494 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="alertmanager" containerID="cri-o://747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9" gracePeriod=120
Apr 24 21:19:30.939871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939530 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-web" containerID="cri-o://6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23" gracePeriod=120
Apr 24 21:19:30.939871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939554 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="prom-label-proxy" containerID="cri-o://c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a" gracePeriod=120
Apr 24 21:19:30.939871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939567 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-metric" containerID="cri-o://184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267" gracePeriod=120
Apr 24 21:19:30.940029 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:30.939577 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="config-reloader" containerID="cri-o://6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316" gracePeriod=120
Apr 24 21:19:31.376681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376601 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a" exitCode=0
Apr 24 21:19:31.376681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376631 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9" exitCode=0
Apr 24 21:19:31.376681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376639 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316" exitCode=0
Apr 24 21:19:31.376681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376645 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9" exitCode=0
Apr 24 21:19:31.376914 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a"}
Apr 24 21:19:31.376914 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9"}
Apr 24 21:19:31.376914 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316"}
Apr 24 21:19:31.376914 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:31.376735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9"}
Apr 24 21:19:32.177713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.177684 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:19:32.267040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.266965 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-metrics-client-ca\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267002 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-web-config\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267023 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-cluster-tls-config\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267065 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-tls-assets\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267096 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267119 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267149 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-out\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267176 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-main-tls\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") "
Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267211 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume
started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-main-db\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " Apr 24 21:19:32.267273 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267235 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " Apr 24 21:19:32.267572 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267277 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " Apr 24 21:19:32.267572 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267302 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-volume\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " Apr 24 21:19:32.267572 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267341 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47n8t\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-kube-api-access-47n8t\") pod \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\" (UID: \"fe7f1595-026d-48d1-845e-1fcf8bc412ef\") " Apr 24 21:19:32.267572 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267340 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:32.267572 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.267543 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-metrics-client-ca\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.268722 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.268669 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:32.269007 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.268968 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:19:32.270461 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.270409 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.270610 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.270578 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:19:32.270836 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.270813 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-kube-api-access-47n8t" (OuterVolumeSpecName: "kube-api-access-47n8t") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "kube-api-access-47n8t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:19:32.271406 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.271383 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-out" (OuterVolumeSpecName: "config-out") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:19:32.271480 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.271410 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.271480 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.271431 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.271801 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.271777 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.271986 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.271968 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.274626 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.274605 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.280411 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.280389 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-web-config" (OuterVolumeSpecName: "web-config") pod "fe7f1595-026d-48d1-845e-1fcf8bc412ef" (UID: "fe7f1595-026d-48d1-845e-1fcf8bc412ef"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:32.368541 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368510 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368541 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368539 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368541 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368549 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-out\") on node 
\"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368559 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-main-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368569 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-main-db\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368577 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1595-026d-48d1-845e-1fcf8bc412ef-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368586 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368594 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-config-volume\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368602 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47n8t\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-kube-api-access-47n8t\") on 
node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368610 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-web-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368618 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fe7f1595-026d-48d1-845e-1fcf8bc412ef-cluster-tls-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.368682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.368625 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fe7f1595-026d-48d1-845e-1fcf8bc412ef-tls-assets\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:19:32.382512 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382476 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267" exitCode=0 Apr 24 21:19:32.382512 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382510 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerID="6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23" exitCode=0 Apr 24 21:19:32.382632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267"} Apr 24 21:19:32.382632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23"} Apr 24 21:19:32.382632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382592 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.382632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fe7f1595-026d-48d1-845e-1fcf8bc412ef","Type":"ContainerDied","Data":"74275d07d88932b84928f3d27ae52462cf0e231bff10f32a02013527cc984cde"} Apr 24 21:19:32.382632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.382618 2578 scope.go:117] "RemoveContainer" containerID="c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a" Apr 24 21:19:32.394723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.394707 2578 scope.go:117] "RemoveContainer" containerID="184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267" Apr 24 21:19:32.401114 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.401098 2578 scope.go:117] "RemoveContainer" containerID="f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9" Apr 24 21:19:32.407030 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.407015 2578 scope.go:117] "RemoveContainer" containerID="6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23" Apr 24 21:19:32.409422 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.409400 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:32.412947 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.412927 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:32.414058 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.414043 2578 
scope.go:117] "RemoveContainer" containerID="6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316" Apr 24 21:19:32.420168 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.420154 2578 scope.go:117] "RemoveContainer" containerID="747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9" Apr 24 21:19:32.426026 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.426009 2578 scope.go:117] "RemoveContainer" containerID="ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808" Apr 24 21:19:32.432025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.431997 2578 scope.go:117] "RemoveContainer" containerID="c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a" Apr 24 21:19:32.432266 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:32.432248 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a\": container with ID starting with c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a not found: ID does not exist" containerID="c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a" Apr 24 21:19:32.432314 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432274 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a"} err="failed to get container status \"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a\": rpc error: code = NotFound desc = could not find container \"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a\": container with ID starting with c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a not found: ID does not exist" Apr 24 21:19:32.432314 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432292 2578 scope.go:117] "RemoveContainer" containerID="184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267" 
Apr 24 21:19:32.432507 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:32.432493 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267\": container with ID starting with 184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267 not found: ID does not exist" containerID="184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267" Apr 24 21:19:32.432544 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432511 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267"} err="failed to get container status \"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267\": rpc error: code = NotFound desc = could not find container \"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267\": container with ID starting with 184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267 not found: ID does not exist" Apr 24 21:19:32.432544 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432522 2578 scope.go:117] "RemoveContainer" containerID="f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9" Apr 24 21:19:32.432696 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:32.432681 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9\": container with ID starting with f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9 not found: ID does not exist" containerID="f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9" Apr 24 21:19:32.432739 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432699 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9"} err="failed to get container status \"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9\": rpc error: code = NotFound desc = could not find container \"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9\": container with ID starting with f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9 not found: ID does not exist" Apr 24 21:19:32.432739 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432710 2578 scope.go:117] "RemoveContainer" containerID="6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23" Apr 24 21:19:32.432951 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:32.432938 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23\": container with ID starting with 6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23 not found: ID does not exist" containerID="6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23" Apr 24 21:19:32.432984 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432952 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23"} err="failed to get container status \"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23\": rpc error: code = NotFound desc = could not find container \"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23\": container with ID starting with 6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23 not found: ID does not exist" Apr 24 21:19:32.432984 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.432964 2578 scope.go:117] "RemoveContainer" containerID="6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316" Apr 24 21:19:32.433157 ip-10-0-134-248 
kubenswrapper[2578]: E0424 21:19:32.433142 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316\": container with ID starting with 6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316 not found: ID does not exist" containerID="6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316" Apr 24 21:19:32.433195 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433160 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316"} err="failed to get container status \"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316\": rpc error: code = NotFound desc = could not find container \"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316\": container with ID starting with 6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316 not found: ID does not exist" Apr 24 21:19:32.433195 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433171 2578 scope.go:117] "RemoveContainer" containerID="747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9" Apr 24 21:19:32.433334 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:32.433319 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9\": container with ID starting with 747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9 not found: ID does not exist" containerID="747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9" Apr 24 21:19:32.433370 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433337 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9"} 
err="failed to get container status \"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9\": rpc error: code = NotFound desc = could not find container \"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9\": container with ID starting with 747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9 not found: ID does not exist" Apr 24 21:19:32.433370 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433347 2578 scope.go:117] "RemoveContainer" containerID="ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808" Apr 24 21:19:32.433526 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:19:32.433513 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808\": container with ID starting with ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808 not found: ID does not exist" containerID="ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808" Apr 24 21:19:32.433564 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433527 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808"} err="failed to get container status \"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808\": rpc error: code = NotFound desc = could not find container \"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808\": container with ID starting with ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808 not found: ID does not exist" Apr 24 21:19:32.433564 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433539 2578 scope.go:117] "RemoveContainer" containerID="c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a" Apr 24 21:19:32.433779 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433739 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a"} err="failed to get container status \"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a\": rpc error: code = NotFound desc = could not find container \"c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a\": container with ID starting with c8c7b7fa48baf4d44e5f544cda36e007ff118903c31519525fa64b87fe99dd7a not found: ID does not exist" Apr 24 21:19:32.433833 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433780 2578 scope.go:117] "RemoveContainer" containerID="184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267" Apr 24 21:19:32.433995 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433976 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267"} err="failed to get container status \"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267\": rpc error: code = NotFound desc = could not find container \"184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267\": container with ID starting with 184f4e06b89fb18130fc1e2573772d75c9ad878b59f52f53b40323427db4e267 not found: ID does not exist" Apr 24 21:19:32.434033 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.433996 2578 scope.go:117] "RemoveContainer" containerID="f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9" Apr 24 21:19:32.434204 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434184 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9"} err="failed to get container status \"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9\": rpc error: code = NotFound desc = could not find container \"f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9\": container with ID starting with 
f07bdf76312ebb44b8dac4ebf73ee81f81795e76ecb61104a6b2ccc4438213a9 not found: ID does not exist" Apr 24 21:19:32.434260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434204 2578 scope.go:117] "RemoveContainer" containerID="6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23" Apr 24 21:19:32.434393 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434378 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23"} err="failed to get container status \"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23\": rpc error: code = NotFound desc = could not find container \"6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23\": container with ID starting with 6ceade7dbdb35955eb2404daef010fa495c44ee90b77637c4f5d124a337b9e23 not found: ID does not exist" Apr 24 21:19:32.434441 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434393 2578 scope.go:117] "RemoveContainer" containerID="6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316" Apr 24 21:19:32.434608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434592 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316"} err="failed to get container status \"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316\": rpc error: code = NotFound desc = could not find container \"6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316\": container with ID starting with 6a0f4b8987410932cf0a02d12279fdbf12b8aace40453c72a28d8a2e26fc0316 not found: ID does not exist" Apr 24 21:19:32.434648 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434608 2578 scope.go:117] "RemoveContainer" containerID="747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9" Apr 24 21:19:32.434806 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434787 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9"} err="failed to get container status \"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9\": rpc error: code = NotFound desc = could not find container \"747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9\": container with ID starting with 747f9a62d406b94310e51b1747ae4ec169dc5dcdb2fef5b76fffd2733233efe9 not found: ID does not exist" Apr 24 21:19:32.434867 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.434806 2578 scope.go:117] "RemoveContainer" containerID="ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808" Apr 24 21:19:32.435035 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.435017 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808"} err="failed to get container status \"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808\": rpc error: code = NotFound desc = could not find container \"ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808\": container with ID starting with ad869b8ad69b1cf0a8db21a270672d4c3a403f2ed43e4d21408b33b044f10808 not found: ID does not exist" Apr 24 21:19:32.440391 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440366 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:32.440670 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440659 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="prom-label-proxy" Apr 24 21:19:32.440713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440672 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="prom-label-proxy" Apr 24 21:19:32.440713 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440682 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="config-reloader" Apr 24 21:19:32.440713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440689 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="config-reloader" Apr 24 21:19:32.440713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440700 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" containerName="console" Apr 24 21:19:32.440713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440705 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" containerName="console" Apr 24 21:19:32.440713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440713 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440719 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440726 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="init-config-reloader" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440732 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="init-config-reloader" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440737 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" 
containerName="alertmanager" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440741 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="alertmanager" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440783 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-metric" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440788 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-metric" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440794 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-web" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440799 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-web" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440841 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="alertmanager" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440849 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1f47a2a-86ee-4f10-bd8b-3f5db0e21bda" containerName="console" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440855 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440862 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-metric" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440867 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="prom-label-proxy" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440874 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="config-reloader" Apr 24 21:19:32.440921 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.440881 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" containerName="kube-rbac-proxy-web" Apr 24 21:19:32.446223 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.446206 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.448496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448479 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:19:32.448496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448491 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:19:32.448597 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448494 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:19:32.448848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:19:32.448848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:19:32.448848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448817 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:19:32.449043 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448888 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qn8sh\"" Apr 24 21:19:32.449043 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:19:32.449043 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.448825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:19:32.454538 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.454517 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:19:32.455609 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.455589 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:32.570335 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-web-config\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570335 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570314 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/86042033-8eca-4738-a4bf-f31cc898ce69-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjqd\" (UniqueName: \"kubernetes.io/projected/86042033-8eca-4738-a4bf-f31cc898ce69-kube-api-access-wrjqd\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570433 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86042033-8eca-4738-a4bf-f31cc898ce69-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570611 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-config-volume\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570611 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/86042033-8eca-4738-a4bf-f31cc898ce69-config-out\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570611 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570611 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570611 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/86042033-8eca-4738-a4bf-f31cc898ce69-tls-assets\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570788 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.570788 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.570639 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86042033-8eca-4738-a4bf-f31cc898ce69-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.659879 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.659858 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7f1595-026d-48d1-845e-1fcf8bc412ef" path="/var/lib/kubelet/pods/fe7f1595-026d-48d1-845e-1fcf8bc412ef/volumes" Apr 24 21:19:32.671514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86042033-8eca-4738-a4bf-f31cc898ce69-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" 
(UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-config-volume\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/86042033-8eca-4738-a4bf-f31cc898ce69-config-out\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671619 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/86042033-8eca-4738-a4bf-f31cc898ce69-tls-assets\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86042033-8eca-4738-a4bf-f31cc898ce69-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-web-config\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.671913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/86042033-8eca-4738-a4bf-f31cc898ce69-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.672208 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:19:32.671920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.672208 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.671944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjqd\" (UniqueName: \"kubernetes.io/projected/86042033-8eca-4738-a4bf-f31cc898ce69-kube-api-access-wrjqd\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.672431 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.672401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86042033-8eca-4738-a4bf-f31cc898ce69-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.672556 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.672534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/86042033-8eca-4738-a4bf-f31cc898ce69-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.673105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.673053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86042033-8eca-4738-a4bf-f31cc898ce69-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.675012 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.674989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/86042033-8eca-4738-a4bf-f31cc898ce69-config-out\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.675125 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.675071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.675194 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.675122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.675256 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.675194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-web-config\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.675256 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.675245 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/86042033-8eca-4738-a4bf-f31cc898ce69-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.675386 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.675268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-config-volume\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.676045 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.676021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.676132 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.676100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.676835 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.676816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/86042033-8eca-4738-a4bf-f31cc898ce69-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.680650 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.680631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjqd\" (UniqueName: 
\"kubernetes.io/projected/86042033-8eca-4738-a4bf-f31cc898ce69-kube-api-access-wrjqd\") pod \"alertmanager-main-0\" (UID: \"86042033-8eca-4738-a4bf-f31cc898ce69\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.756761 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.756735 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:32.878400 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:32.878373 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:32.880558 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:19:32.880529 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86042033_8eca_4738_a4bf_f31cc898ce69.slice/crio-4c7cefb35393de87e4612b031a53cf14cac29547131e7fb16150e8209d7c05da WatchSource:0}: Error finding container 4c7cefb35393de87e4612b031a53cf14cac29547131e7fb16150e8209d7c05da: Status 404 returned error can't find the container with id 4c7cefb35393de87e4612b031a53cf14cac29547131e7fb16150e8209d7c05da Apr 24 21:19:33.386204 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:33.386169 2578 generic.go:358] "Generic (PLEG): container finished" podID="86042033-8eca-4738-a4bf-f31cc898ce69" containerID="7a37ce7d9e48d331216c26874ed64b0c815c925d0fdfee3018000475b210c8b1" exitCode=0 Apr 24 21:19:33.386591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:33.386264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerDied","Data":"7a37ce7d9e48d331216c26874ed64b0c815c925d0fdfee3018000475b210c8b1"} Apr 24 21:19:33.386591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:33.386307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"4c7cefb35393de87e4612b031a53cf14cac29547131e7fb16150e8209d7c05da"} Apr 24 21:19:34.396496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.396463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"27de388d810b1721e595c8e2ec9ed7c0a5d834931bf2fccab0bb0b2d3ef3dcb7"} Apr 24 21:19:34.396496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.396497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"3c811ae324fb6514f9211bde7ace4b4d02a243b2981433b1a560b3cae059b546"} Apr 24 21:19:34.396926 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.396506 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"1933ec64b211c34ba970fb4064d8ec3beb640c84ef8649e2aec477d04b3f16aa"} Apr 24 21:19:34.396926 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.396514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"3282cdc440ec08defa4a86e2b96ca559597ff21ef3adb77ef8be7a57e3a2c416"} Apr 24 21:19:34.396926 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.396521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"cf984d31981980710b7fe64f9f6fce030d2053028d283780fd262ff45056df29"} Apr 24 21:19:34.396926 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.396529 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"86042033-8eca-4738-a4bf-f31cc898ce69","Type":"ContainerStarted","Data":"9e0862ad56c234aa92a6e8518b66172e40623e23ebab50748c439723b50274c3"} Apr 24 21:19:34.974197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.974146 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.974128157 podStartE2EDuration="2.974128157s" podCreationTimestamp="2026-04-24 21:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:19:34.436654704 +0000 UTC m=+198.404456770" watchObservedRunningTime="2026-04-24 21:19:34.974128157 +0000 UTC m=+198.941930217" Apr 24 21:19:34.975191 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.975172 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8667f847-f6cb2"] Apr 24 21:19:34.979036 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.979021 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:34.981738 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.981714 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 21:19:34.981871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.981850 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 21:19:34.981871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.981860 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-vpqqk\"" Apr 24 21:19:34.982228 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.982208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 21:19:34.982228 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.982223 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 21:19:34.982367 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.982213 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 21:19:34.988394 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.988368 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 21:19:34.989180 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:34.989158 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8667f847-f6cb2"] Apr 24 21:19:35.088843 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-telemeter-client-tls\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.088936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088847 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.088936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hnr4\" (UniqueName: \"kubernetes.io/projected/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-kube-api-access-4hnr4\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.088936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-secret-telemeter-client\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.088936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.089078 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-metrics-client-ca\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.089078 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.088972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-federate-client-tls\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.089078 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.089034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-serving-certs-ca-bundle\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.189851 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-federate-client-tls\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " 
pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.189936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-serving-certs-ca-bundle\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.189936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-telemeter-client-tls\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.189936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.189936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hnr4\" (UniqueName: \"kubernetes.io/projected/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-kube-api-access-4hnr4\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.190101 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189962 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-secret-telemeter-client\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.190101 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.189985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.190101 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.190023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-metrics-client-ca\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.190683 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.190564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-serving-certs-ca-bundle\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.190845 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.190822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-metrics-client-ca\") pod \"telemeter-client-8667f847-f6cb2\" (UID: 
\"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.191080 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.191059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.192533 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.192509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.192678 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.192659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-secret-telemeter-client\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.192994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.192976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-telemeter-client-tls\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.193068 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.193019 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-federate-client-tls\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.197595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.197574 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hnr4\" (UniqueName: \"kubernetes.io/projected/bb8263f1-b10d-461d-9d2e-d38fc7ff82f3-kube-api-access-4hnr4\") pod \"telemeter-client-8667f847-f6cb2\" (UID: \"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3\") " pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.290113 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.290060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" Apr 24 21:19:35.430712 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:35.430684 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8667f847-f6cb2"] Apr 24 21:19:35.432842 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:19:35.432809 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8263f1_b10d_461d_9d2e_d38fc7ff82f3.slice/crio-68b848f41a90507e5fb67471ad216d3028b08d7137f39c104e3cde11473c8879 WatchSource:0}: Error finding container 68b848f41a90507e5fb67471ad216d3028b08d7137f39c104e3cde11473c8879: Status 404 returned error can't find the container with id 68b848f41a90507e5fb67471ad216d3028b08d7137f39c104e3cde11473c8879 Apr 24 21:19:36.403622 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:36.403591 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" 
event={"ID":"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3","Type":"ContainerStarted","Data":"68b848f41a90507e5fb67471ad216d3028b08d7137f39c104e3cde11473c8879"} Apr 24 21:19:37.407811 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:37.407776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" event={"ID":"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3","Type":"ContainerStarted","Data":"09f48755dba2982cdbcde3744592b80d78fb13462d42deaf942ebecaccf23f18"} Apr 24 21:19:37.408197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:37.407816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" event={"ID":"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3","Type":"ContainerStarted","Data":"f63fae876771b0c249d8bfa5230c1c6359b8e7cbf40cab2ee0f5f60eab6e9f44"} Apr 24 21:19:37.408197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:37.407830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" event={"ID":"bb8263f1-b10d-461d-9d2e-d38fc7ff82f3","Type":"ContainerStarted","Data":"e7eacac094bb3e328de01b4a2499b62743858983cd6c4f98daa9bf3370e93b2f"} Apr 24 21:19:37.429314 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:37.429270 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8667f847-f6cb2" podStartSLOduration=1.732282352 podStartE2EDuration="3.429255821s" podCreationTimestamp="2026-04-24 21:19:34 +0000 UTC" firstStartedPulling="2026-04-24 21:19:35.434793848 +0000 UTC m=+199.402595888" lastFinishedPulling="2026-04-24 21:19:37.131767314 +0000 UTC m=+201.099569357" observedRunningTime="2026-04-24 21:19:37.427590707 +0000 UTC m=+201.395392963" watchObservedRunningTime="2026-04-24 21:19:37.429255821 +0000 UTC m=+201.397057881" Apr 24 21:19:38.191960 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.191929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-6bb7c4f64c-pvhr4"] Apr 24 21:19:38.195089 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.195074 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.207619 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.207596 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb7c4f64c-pvhr4"] Apr 24 21:19:38.311869 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.311846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-trusted-ca-bundle\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.311993 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.311878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-oauth-config\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.311993 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.311912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-console-config\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.312100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.311992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-service-ca\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.312100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.312042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfj2l\" (UniqueName: \"kubernetes.io/projected/510777fb-15bf-438e-8c19-81e2c44d9747-kube-api-access-tfj2l\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.312100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.312088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-serving-cert\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.312201 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.312107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-oauth-serving-cert\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.412876 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.412852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-console-config\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413198 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:19:38.412879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-service-ca\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.412899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfj2l\" (UniqueName: \"kubernetes.io/projected/510777fb-15bf-438e-8c19-81e2c44d9747-kube-api-access-tfj2l\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-serving-cert\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-oauth-serving-cert\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-trusted-ca-bundle\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " 
pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-oauth-config\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-service-ca\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413671 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-console-config\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.413875 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.413683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-oauth-serving-cert\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.414094 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.414074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-trusted-ca-bundle\") pod \"console-6bb7c4f64c-pvhr4\" (UID: 
\"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.415577 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.415559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-oauth-config\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.415696 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.415681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-serving-cert\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.433385 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.433364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfj2l\" (UniqueName: \"kubernetes.io/projected/510777fb-15bf-438e-8c19-81e2c44d9747-kube-api-access-tfj2l\") pod \"console-6bb7c4f64c-pvhr4\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") " pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.504136 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.504077 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:38.618546 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:38.618517 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb7c4f64c-pvhr4"] Apr 24 21:19:38.621445 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:19:38.621419 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod510777fb_15bf_438e_8c19_81e2c44d9747.slice/crio-a6515cf1f7eb042b68b9ce206c3d47cabea886930a9dc4d81e5dd97c772479d5 WatchSource:0}: Error finding container a6515cf1f7eb042b68b9ce206c3d47cabea886930a9dc4d81e5dd97c772479d5: Status 404 returned error can't find the container with id a6515cf1f7eb042b68b9ce206c3d47cabea886930a9dc4d81e5dd97c772479d5 Apr 24 21:19:39.415550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:39.415509 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb7c4f64c-pvhr4" event={"ID":"510777fb-15bf-438e-8c19-81e2c44d9747","Type":"ContainerStarted","Data":"4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff"} Apr 24 21:19:39.415906 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:39.415557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb7c4f64c-pvhr4" event={"ID":"510777fb-15bf-438e-8c19-81e2c44d9747","Type":"ContainerStarted","Data":"a6515cf1f7eb042b68b9ce206c3d47cabea886930a9dc4d81e5dd97c772479d5"} Apr 24 21:19:39.432789 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:39.432720 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bb7c4f64c-pvhr4" podStartSLOduration=1.432705261 podStartE2EDuration="1.432705261s" podCreationTimestamp="2026-04-24 21:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:19:39.430530033 +0000 UTC 
m=+203.398332091" watchObservedRunningTime="2026-04-24 21:19:39.432705261 +0000 UTC m=+203.400507323" Apr 24 21:19:48.504852 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:48.504818 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:48.505236 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:48.504894 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:48.509306 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:48.509287 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:49.446268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:49.446244 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bb7c4f64c-pvhr4" Apr 24 21:19:49.493995 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:19:49.493953 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67f946669c-79tgc"] Apr 24 21:20:14.516955 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.516921 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67f946669c-79tgc" podUID="5c925852-4ea8-4bb0-8044-486ad660258d" containerName="console" containerID="cri-o://11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d" gracePeriod=15 Apr 24 21:20:14.778105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.778082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67f946669c-79tgc_5c925852-4ea8-4bb0-8044-486ad660258d/console/0.log" Apr 24 21:20:14.778203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.778141 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67f946669c-79tgc" Apr 24 21:20:14.873599 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.873569 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-service-ca\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " Apr 24 21:20:14.873728 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.873606 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-console-config\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " Apr 24 21:20:14.873728 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.873624 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-trusted-ca-bundle\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " Apr 24 21:20:14.873728 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.873663 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-oauth-config\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " Apr 24 21:20:14.873922 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.873801 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-serving-cert\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") " Apr 24 21:20:14.873922 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:20:14.873853 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpdls\" (UniqueName: \"kubernetes.io/projected/5c925852-4ea8-4bb0-8044-486ad660258d-kube-api-access-bpdls\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") "
Apr 24 21:20:14.873922 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.873893 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-oauth-serving-cert\") pod \"5c925852-4ea8-4bb0-8044-486ad660258d\" (UID: \"5c925852-4ea8-4bb0-8044-486ad660258d\") "
Apr 24 21:20:14.874116 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.874062 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-service-ca" (OuterVolumeSpecName: "service-ca") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:20:14.874116 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.874097 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-console-config" (OuterVolumeSpecName: "console-config") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:20:14.874303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.874178 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-service-ca\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:14.874303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.874202 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:20:14.874303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.874228 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-console-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:14.874418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.874345 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:20:14.875922 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.875906 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:20:14.876455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.876430 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:20:14.876526 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.876457 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c925852-4ea8-4bb0-8044-486ad660258d-kube-api-access-bpdls" (OuterVolumeSpecName: "kube-api-access-bpdls") pod "5c925852-4ea8-4bb0-8044-486ad660258d" (UID: "5c925852-4ea8-4bb0-8044-486ad660258d"). InnerVolumeSpecName "kube-api-access-bpdls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:20:14.975105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.975082 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-oauth-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:14.975105 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.975103 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c925852-4ea8-4bb0-8044-486ad660258d-console-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:14.975219 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.975113 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpdls\" (UniqueName: \"kubernetes.io/projected/5c925852-4ea8-4bb0-8044-486ad660258d-kube-api-access-bpdls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:14.975219 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.975122 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-oauth-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:14.975219 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:14.975131 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c925852-4ea8-4bb0-8044-486ad660258d-trusted-ca-bundle\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:20:15.517054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.517029 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67f946669c-79tgc_5c925852-4ea8-4bb0-8044-486ad660258d/console/0.log"
Apr 24 21:20:15.517395 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.517071 2578 generic.go:358] "Generic (PLEG): container finished" podID="5c925852-4ea8-4bb0-8044-486ad660258d" containerID="11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d" exitCode=2
Apr 24 21:20:15.517395 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.517128 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67f946669c-79tgc"
Apr 24 21:20:15.517395 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.517165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67f946669c-79tgc" event={"ID":"5c925852-4ea8-4bb0-8044-486ad660258d","Type":"ContainerDied","Data":"11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d"}
Apr 24 21:20:15.517395 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.517208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67f946669c-79tgc" event={"ID":"5c925852-4ea8-4bb0-8044-486ad660258d","Type":"ContainerDied","Data":"8b4bad5b5680fd068e90b21cf1584e80b369a9d7df8faeb989f723cb30444788"}
Apr 24 21:20:15.517395 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.517224 2578 scope.go:117] "RemoveContainer" containerID="11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d"
Apr 24 21:20:15.525181 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.525166 2578 scope.go:117] "RemoveContainer" containerID="11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d"
Apr 24 21:20:15.525396 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:20:15.525381 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d\": container with ID starting with 11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d not found: ID does not exist" containerID="11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d"
Apr 24 21:20:15.525432 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.525403 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d"} err="failed to get container status \"11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d\": rpc error: code = NotFound desc = could not find container \"11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d\": container with ID starting with 11e1fa61e20e7315ecc685f34d678478c2ea9fee0f6d4d4e040bb6f93800ea1d not found: ID does not exist"
Apr 24 21:20:15.537323 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.537292 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67f946669c-79tgc"]
Apr 24 21:20:15.544528 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:15.544507 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67f946669c-79tgc"]
Apr 24 21:20:16.658819 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:16.658785 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c925852-4ea8-4bb0-8044-486ad660258d" path="/var/lib/kubelet/pods/5c925852-4ea8-4bb0-8044-486ad660258d/volumes"
Apr 24 21:20:53.260347 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.260314 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-99c44456d-jq9fv"]
Apr 24 21:20:53.260723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.260622 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c925852-4ea8-4bb0-8044-486ad660258d" containerName="console"
Apr 24 21:20:53.260723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.260636 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c925852-4ea8-4bb0-8044-486ad660258d" containerName="console"
Apr 24 21:20:53.260723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.260698 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c925852-4ea8-4bb0-8044-486ad660258d" containerName="console"
Apr 24 21:20:53.264660 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.264641 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.277313 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.277293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-99c44456d-jq9fv"]
Apr 24 21:20:53.332872 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.332843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-oauth-config\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.332999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.332883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-config\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.332999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.332909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-service-ca\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.332999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.332954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-oauth-serving-cert\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.332999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.332996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd49t\" (UniqueName: \"kubernetes.io/projected/4982c561-07c4-461e-a7c8-9f4e7b7406e6-kube-api-access-jd49t\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.333199 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.333029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-serving-cert\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.333199 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.333110 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-trusted-ca-bundle\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.433777 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-serving-cert\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.433871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-trusted-ca-bundle\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.433871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-oauth-config\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.433871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-config\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.433871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-service-ca\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.434070 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-oauth-serving-cert\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.434070 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.433900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd49t\" (UniqueName: \"kubernetes.io/projected/4982c561-07c4-461e-a7c8-9f4e7b7406e6-kube-api-access-jd49t\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.434708 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.434681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-config\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.434829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.434681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-service-ca\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.434829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.434779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-oauth-serving-cert\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.434829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.434813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-trusted-ca-bundle\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.436290 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.436263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-serving-cert\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.436384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.436339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-oauth-config\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.445352 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.445332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd49t\" (UniqueName: \"kubernetes.io/projected/4982c561-07c4-461e-a7c8-9f4e7b7406e6-kube-api-access-jd49t\") pod \"console-99c44456d-jq9fv\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.573486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.573431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:20:53.689402 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:53.689365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-99c44456d-jq9fv"]
Apr 24 21:20:53.692224 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:20:53.692190 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4982c561_07c4_461e_a7c8_9f4e7b7406e6.slice/crio-0cf8a260678c50983bf464b61eb94bbec059f4f2843cf0e66e093a87a45b7836 WatchSource:0}: Error finding container 0cf8a260678c50983bf464b61eb94bbec059f4f2843cf0e66e093a87a45b7836: Status 404 returned error can't find the container with id 0cf8a260678c50983bf464b61eb94bbec059f4f2843cf0e66e093a87a45b7836
Apr 24 21:20:54.627785 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:54.627730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99c44456d-jq9fv" event={"ID":"4982c561-07c4-461e-a7c8-9f4e7b7406e6","Type":"ContainerStarted","Data":"148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a"}
Apr 24 21:20:54.627785 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:54.627785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99c44456d-jq9fv" event={"ID":"4982c561-07c4-461e-a7c8-9f4e7b7406e6","Type":"ContainerStarted","Data":"0cf8a260678c50983bf464b61eb94bbec059f4f2843cf0e66e093a87a45b7836"}
Apr 24 21:20:54.659160 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:20:54.659115 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-99c44456d-jq9fv" podStartSLOduration=1.659102287 podStartE2EDuration="1.659102287s" podCreationTimestamp="2026-04-24 21:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:20:54.658100391 +0000 UTC m=+278.625902452" watchObservedRunningTime="2026-04-24 21:20:54.659102287 +0000 UTC m=+278.626904349"
Apr 24 21:21:03.574092 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:03.574059 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:21:03.574474 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:03.574104 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:21:03.578593 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:03.578571 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:21:03.657415 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:03.657392 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:21:03.701488 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:03.701452 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bb7c4f64c-pvhr4"]
Apr 24 21:21:16.538635 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:16.538605 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 21:21:16.540020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:16.539995 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 21:21:16.542911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:16.542893 2578 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:21:28.721202 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:28.721158 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bb7c4f64c-pvhr4" podUID="510777fb-15bf-438e-8c19-81e2c44d9747" containerName="console" containerID="cri-o://4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff" gracePeriod=15
Apr 24 21:21:28.954301 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:28.954281 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb7c4f64c-pvhr4_510777fb-15bf-438e-8c19-81e2c44d9747/console/0.log"
Apr 24 21:21:28.954397 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:28.954343 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb7c4f64c-pvhr4"
Apr 24 21:21:29.089968 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.089880 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-serving-cert\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.089968 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.089946 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-service-ca\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.089968 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.089976 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfj2l\" (UniqueName: \"kubernetes.io/projected/510777fb-15bf-438e-8c19-81e2c44d9747-kube-api-access-tfj2l\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.090236 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090002 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-trusted-ca-bundle\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.090236 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090020 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-oauth-serving-cert\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.090236 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090058 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-console-config\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.090236 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090099 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-oauth-config\") pod \"510777fb-15bf-438e-8c19-81e2c44d9747\" (UID: \"510777fb-15bf-438e-8c19-81e2c44d9747\") "
Apr 24 21:21:29.090439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090379 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:21:29.090487 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090455 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-service-ca" (OuterVolumeSpecName: "service-ca") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:21:29.090487 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090464 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:21:29.090487 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.090471 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-console-config" (OuterVolumeSpecName: "console-config") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:21:29.092174 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.092140 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:21:29.092274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.092248 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510777fb-15bf-438e-8c19-81e2c44d9747-kube-api-access-tfj2l" (OuterVolumeSpecName: "kube-api-access-tfj2l") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "kube-api-access-tfj2l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:21:29.092313 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.092285 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "510777fb-15bf-438e-8c19-81e2c44d9747" (UID: "510777fb-15bf-438e-8c19-81e2c44d9747"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:21:29.190831 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190789 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfj2l\" (UniqueName: \"kubernetes.io/projected/510777fb-15bf-438e-8c19-81e2c44d9747-kube-api-access-tfj2l\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.190831 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190825 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-trusted-ca-bundle\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.190831 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190834 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-oauth-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.190831 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190845 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-console-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.191063 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190855 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-oauth-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.191063 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190865 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/510777fb-15bf-438e-8c19-81e2c44d9747-console-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.191063 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.190873 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/510777fb-15bf-438e-8c19-81e2c44d9747-service-ca\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:21:29.728102 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.728074 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb7c4f64c-pvhr4_510777fb-15bf-438e-8c19-81e2c44d9747/console/0.log"
Apr 24 21:21:29.728450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.728115 2578 generic.go:358] "Generic (PLEG): container finished" podID="510777fb-15bf-438e-8c19-81e2c44d9747" containerID="4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff" exitCode=2
Apr 24 21:21:29.728450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.728185 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb7c4f64c-pvhr4"
Apr 24 21:21:29.728450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.728201 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb7c4f64c-pvhr4" event={"ID":"510777fb-15bf-438e-8c19-81e2c44d9747","Type":"ContainerDied","Data":"4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff"}
Apr 24 21:21:29.728450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.728233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb7c4f64c-pvhr4" event={"ID":"510777fb-15bf-438e-8c19-81e2c44d9747","Type":"ContainerDied","Data":"a6515cf1f7eb042b68b9ce206c3d47cabea886930a9dc4d81e5dd97c772479d5"}
Apr 24 21:21:29.728450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.728249 2578 scope.go:117] "RemoveContainer" containerID="4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff"
Apr 24 21:21:29.736740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.736723 2578 scope.go:117] "RemoveContainer" containerID="4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff"
Apr 24 21:21:29.737022 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:21:29.737003 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff\": container with ID starting with 4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff not found: ID does not exist" containerID="4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff"
Apr 24 21:21:29.737080 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.737030 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff"} err="failed to get container status \"4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff\": rpc error: code = NotFound desc = could not find container \"4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff\": container with ID starting with 4bdc0588e0b767fe11ce7cda87664b51e8a910002844a11ed66a2cc2070860ff not found: ID does not exist"
Apr 24 21:21:29.748098 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.748077 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bb7c4f64c-pvhr4"]
Apr 24 21:21:29.753616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:29.753594 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bb7c4f64c-pvhr4"]
Apr 24 21:21:30.660112 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:30.660068 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510777fb-15bf-438e-8c19-81e2c44d9747" path="/var/lib/kubelet/pods/510777fb-15bf-438e-8c19-81e2c44d9747/volumes"
Apr 24 21:21:33.807913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.807877 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-44jqp"]
Apr 24 21:21:33.808293 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.808183 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="510777fb-15bf-438e-8c19-81e2c44d9747" containerName="console"
Apr 24 21:21:33.808293 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.808193 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="510777fb-15bf-438e-8c19-81e2c44d9747" containerName="console"
Apr 24 21:21:33.808293 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.808244 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="510777fb-15bf-438e-8c19-81e2c44d9747" containerName="console"
Apr 24 21:21:33.812461 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.812442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:33.815125 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.815106 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:21:33.820020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.819999 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-44jqp"]
Apr 24 21:21:33.932129 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.932094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e2296a6c-7782-4207-85a9-4737aabd81e8-dbus\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:33.932129 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.932129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e2296a6c-7782-4207-85a9-4737aabd81e8-original-pull-secret\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:33.932310 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:33.932154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e2296a6c-7782-4207-85a9-4737aabd81e8-kubelet-config\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:34.032710 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.032669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e2296a6c-7782-4207-85a9-4737aabd81e8-kubelet-config\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:34.032894 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.032761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e2296a6c-7782-4207-85a9-4737aabd81e8-dbus\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:34.032894 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.032781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e2296a6c-7782-4207-85a9-4737aabd81e8-original-pull-secret\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:34.032894 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.032818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e2296a6c-7782-4207-85a9-4737aabd81e8-kubelet-config\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:34.032992 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.032893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e2296a6c-7782-4207-85a9-4737aabd81e8-dbus\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp"
Apr 24 21:21:34.035131 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.035109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e2296a6c-7782-4207-85a9-4737aabd81e8-original-pull-secret\") pod \"global-pull-secret-syncer-44jqp\" (UID: \"e2296a6c-7782-4207-85a9-4737aabd81e8\") " pod="kube-system/global-pull-secret-syncer-44jqp" Apr 24 21:21:34.121766 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.121668 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-44jqp" Apr 24 21:21:34.237879 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.237798 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-44jqp"] Apr 24 21:21:34.240485 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:21:34.240459 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2296a6c_7782_4207_85a9_4737aabd81e8.slice/crio-a44c14be4d58e2fd6aaeb207c5e761f23b6a223eb564beb2a45211432db70a31 WatchSource:0}: Error finding container a44c14be4d58e2fd6aaeb207c5e761f23b6a223eb564beb2a45211432db70a31: Status 404 returned error can't find the container with id a44c14be4d58e2fd6aaeb207c5e761f23b6a223eb564beb2a45211432db70a31 Apr 24 21:21:34.241987 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.241970 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:21:34.743375 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:34.743342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-44jqp" event={"ID":"e2296a6c-7782-4207-85a9-4737aabd81e8","Type":"ContainerStarted","Data":"a44c14be4d58e2fd6aaeb207c5e761f23b6a223eb564beb2a45211432db70a31"} Apr 24 21:21:38.756588 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:38.756551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-44jqp" 
event={"ID":"e2296a6c-7782-4207-85a9-4737aabd81e8","Type":"ContainerStarted","Data":"c5ef02845f4f0b4c9a707e23ae6d3c925ba50acf6901a62141c7c6862d3a76a0"} Apr 24 21:21:38.781097 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:21:38.781053 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-44jqp" podStartSLOduration=2.210121419 podStartE2EDuration="5.781040113s" podCreationTimestamp="2026-04-24 21:21:33 +0000 UTC" firstStartedPulling="2026-04-24 21:21:34.242094482 +0000 UTC m=+318.209896522" lastFinishedPulling="2026-04-24 21:21:37.813013175 +0000 UTC m=+321.780815216" observedRunningTime="2026-04-24 21:21:38.780074439 +0000 UTC m=+322.747876499" watchObservedRunningTime="2026-04-24 21:21:38.781040113 +0000 UTC m=+322.748842174" Apr 24 21:22:22.450151 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.450120 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-59svc"] Apr 24 21:22:22.453367 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.453350 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.455669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.455643 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-64sjn\"" Apr 24 21:22:22.455813 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.455793 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:22:22.455874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.455797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:22:22.456462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.456446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:22:22.456526 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.456454 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:22:22.456526 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.456504 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:22:22.463908 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.463886 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-59svc"] Apr 24 21:22:22.480242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.480217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/4df54f97-66de-4068-9abd-86c7cf56078b-cabundle0\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.480344 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:22:22.480271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtt97\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-kube-api-access-jtt97\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.480344 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.480299 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.581180 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.581154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtt97\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-kube-api-access-jtt97\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.581295 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.581183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.581295 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.581215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/4df54f97-66de-4068-9abd-86c7cf56078b-cabundle0\") pod \"keda-operator-ffbb595cb-59svc\" (UID: 
\"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.581407 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:22.581297 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 21:22:22.581407 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:22.581316 2578 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:22:22.581407 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:22.581323 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:22:22.581407 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:22.581336 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-59svc: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:22:22.581407 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:22.581390 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates podName:4df54f97-66de-4068-9abd-86c7cf56078b nodeName:}" failed. No retries permitted until 2026-04-24 21:22:23.081374817 +0000 UTC m=+367.049176862 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates") pod "keda-operator-ffbb595cb-59svc" (UID: "4df54f97-66de-4068-9abd-86c7cf56078b") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:22:22.581723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.581707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/4df54f97-66de-4068-9abd-86c7cf56078b-cabundle0\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:22.593658 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:22.593636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtt97\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-kube-api-access-jtt97\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:23.085093 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:23.085065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:23.085240 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:23.085195 2578 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:22:23.085240 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:23.085213 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:22:23.085240 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:23.085224 2578 
projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-59svc: references non-existent secret key: ca.crt Apr 24 21:22:23.085370 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:23.085296 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates podName:4df54f97-66de-4068-9abd-86c7cf56078b nodeName:}" failed. No retries permitted until 2026-04-24 21:22:24.085280739 +0000 UTC m=+368.053082779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates") pod "keda-operator-ffbb595cb-59svc" (UID: "4df54f97-66de-4068-9abd-86c7cf56078b") : references non-existent secret key: ca.crt Apr 24 21:22:24.094486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:24.094448 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:24.094853 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:24.094584 2578 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:22:24.094853 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:24.094602 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:22:24.094853 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:24.094611 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-59svc: references non-existent secret key: ca.crt Apr 24 21:22:24.094853 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:24.094672 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates podName:4df54f97-66de-4068-9abd-86c7cf56078b nodeName:}" failed. No retries permitted until 2026-04-24 21:22:26.094658305 +0000 UTC m=+370.062460349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates") pod "keda-operator-ffbb595cb-59svc" (UID: "4df54f97-66de-4068-9abd-86c7cf56078b") : references non-existent secret key: ca.crt Apr 24 21:22:26.108764 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:26.108726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:26.109126 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:26.108858 2578 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:22:26.109126 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:26.108874 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:22:26.109126 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:26.108882 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-59svc: references non-existent secret key: ca.crt Apr 24 21:22:26.109126 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:22:26.108925 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates podName:4df54f97-66de-4068-9abd-86c7cf56078b nodeName:}" failed. No retries permitted until 2026-04-24 21:22:30.108911974 +0000 UTC m=+374.076714014 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates") pod "keda-operator-ffbb595cb-59svc" (UID: "4df54f97-66de-4068-9abd-86c7cf56078b") : references non-existent secret key: ca.crt Apr 24 21:22:30.134460 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:30.134427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:30.136855 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:30.136836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4df54f97-66de-4068-9abd-86c7cf56078b-certificates\") pod \"keda-operator-ffbb595cb-59svc\" (UID: \"4df54f97-66de-4068-9abd-86c7cf56078b\") " pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:30.264724 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:30.264699 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:30.390264 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:30.390189 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-59svc"] Apr 24 21:22:30.393019 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:22:30.392989 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4df54f97_66de_4068_9abd_86c7cf56078b.slice/crio-d8bda29c43304c08c605f6bacd407290ef80df2ef3bb83e31d8d0e85f21bb214 WatchSource:0}: Error finding container d8bda29c43304c08c605f6bacd407290ef80df2ef3bb83e31d8d0e85f21bb214: Status 404 returned error can't find the container with id d8bda29c43304c08c605f6bacd407290ef80df2ef3bb83e31d8d0e85f21bb214 Apr 24 21:22:30.905792 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:30.905742 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-59svc" event={"ID":"4df54f97-66de-4068-9abd-86c7cf56078b","Type":"ContainerStarted","Data":"d8bda29c43304c08c605f6bacd407290ef80df2ef3bb83e31d8d0e85f21bb214"} Apr 24 21:22:33.917007 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:33.916965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-59svc" event={"ID":"4df54f97-66de-4068-9abd-86c7cf56078b","Type":"ContainerStarted","Data":"87ab01f3eb88241ce90b8a875ef13899eaf20ff3e286a3af325d7158a5cb7c8e"} Apr 24 21:22:33.917351 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:33.917125 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:22:33.935215 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:33.935167 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-59svc" podStartSLOduration=9.013055764 podStartE2EDuration="11.935151833s" 
podCreationTimestamp="2026-04-24 21:22:22 +0000 UTC" firstStartedPulling="2026-04-24 21:22:30.394148663 +0000 UTC m=+374.361950703" lastFinishedPulling="2026-04-24 21:22:33.316244729 +0000 UTC m=+377.284046772" observedRunningTime="2026-04-24 21:22:33.933656751 +0000 UTC m=+377.901458812" watchObservedRunningTime="2026-04-24 21:22:33.935151833 +0000 UTC m=+377.902953895" Apr 24 21:22:54.922471 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:22:54.922444 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-59svc" Apr 24 21:23:32.419944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.419867 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-lgpwz"] Apr 24 21:23:32.427916 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.427883 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rq55d"] Apr 24 21:23:32.428081 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.428060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.430891 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.430870 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:23:32.431428 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.431411 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.431835 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.431815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-lrmb7\"" Apr 24 21:23:32.431946 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.431875 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:23:32.431946 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.431878 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:23:32.433632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.433482 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:23:32.433944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.433925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-4qmcx\"" Apr 24 21:23:32.435578 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.435481 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-lgpwz"] Apr 24 21:23:32.437994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.437973 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rq55d"] Apr 24 21:23:32.582620 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.582588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06fbf7df-838a-4579-9dbf-44959637e896-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rq55d\" (UID: \"06fbf7df-838a-4579-9dbf-44959637e896\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.582780 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:23:32.582635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnd4\" (UniqueName: \"kubernetes.io/projected/10d5fc69-d500-40da-ab21-9252bde96100-kube-api-access-nsnd4\") pod \"kserve-controller-manager-74fc8f6f96-lgpwz\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.582780 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.582711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10d5fc69-d500-40da-ab21-9252bde96100-cert\") pod \"kserve-controller-manager-74fc8f6f96-lgpwz\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.582878 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.582780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjcj9\" (UniqueName: \"kubernetes.io/projected/06fbf7df-838a-4579-9dbf-44959637e896-kube-api-access-cjcj9\") pod \"llmisvc-controller-manager-68cc5db7c4-rq55d\" (UID: \"06fbf7df-838a-4579-9dbf-44959637e896\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.683991 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.683924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnd4\" (UniqueName: \"kubernetes.io/projected/10d5fc69-d500-40da-ab21-9252bde96100-kube-api-access-nsnd4\") pod \"kserve-controller-manager-74fc8f6f96-lgpwz\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.683991 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.683964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/10d5fc69-d500-40da-ab21-9252bde96100-cert\") pod \"kserve-controller-manager-74fc8f6f96-lgpwz\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.683991 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.683984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjcj9\" (UniqueName: \"kubernetes.io/projected/06fbf7df-838a-4579-9dbf-44959637e896-kube-api-access-cjcj9\") pod \"llmisvc-controller-manager-68cc5db7c4-rq55d\" (UID: \"06fbf7df-838a-4579-9dbf-44959637e896\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.684222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.684028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06fbf7df-838a-4579-9dbf-44959637e896-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rq55d\" (UID: \"06fbf7df-838a-4579-9dbf-44959637e896\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.686321 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.686292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06fbf7df-838a-4579-9dbf-44959637e896-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rq55d\" (UID: \"06fbf7df-838a-4579-9dbf-44959637e896\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.686433 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.686350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10d5fc69-d500-40da-ab21-9252bde96100-cert\") pod \"kserve-controller-manager-74fc8f6f96-lgpwz\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.693130 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.693111 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjcj9\" (UniqueName: \"kubernetes.io/projected/06fbf7df-838a-4579-9dbf-44959637e896-kube-api-access-cjcj9\") pod \"llmisvc-controller-manager-68cc5db7c4-rq55d\" (UID: \"06fbf7df-838a-4579-9dbf-44959637e896\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.693435 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.693414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnd4\" (UniqueName: \"kubernetes.io/projected/10d5fc69-d500-40da-ab21-9252bde96100-kube-api-access-nsnd4\") pod \"kserve-controller-manager-74fc8f6f96-lgpwz\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.741736 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.741711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:32.748375 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.748355 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:32.873505 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.873472 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-lgpwz"] Apr 24 21:23:32.876693 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:23:32.876667 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d5fc69_d500_40da_ab21_9252bde96100.slice/crio-312effae72dd3b396d829329cac8e43c08736f84365a404bbdb5ec3584939b2f WatchSource:0}: Error finding container 312effae72dd3b396d829329cac8e43c08736f84365a404bbdb5ec3584939b2f: Status 404 returned error can't find the container with id 312effae72dd3b396d829329cac8e43c08736f84365a404bbdb5ec3584939b2f Apr 24 21:23:32.891691 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:32.891673 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rq55d"] Apr 24 21:23:32.893351 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:23:32.893327 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod06fbf7df_838a_4579_9dbf_44959637e896.slice/crio-b88eb71a571394044d7d71efb432df4d5d157c7c4940256c9003c54f9f086bfb WatchSource:0}: Error finding container b88eb71a571394044d7d71efb432df4d5d157c7c4940256c9003c54f9f086bfb: Status 404 returned error can't find the container with id b88eb71a571394044d7d71efb432df4d5d157c7c4940256c9003c54f9f086bfb Apr 24 21:23:33.078579 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:33.078512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" event={"ID":"06fbf7df-838a-4579-9dbf-44959637e896","Type":"ContainerStarted","Data":"b88eb71a571394044d7d71efb432df4d5d157c7c4940256c9003c54f9f086bfb"} Apr 24 21:23:33.079591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:33.079569 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" event={"ID":"10d5fc69-d500-40da-ab21-9252bde96100","Type":"ContainerStarted","Data":"312effae72dd3b396d829329cac8e43c08736f84365a404bbdb5ec3584939b2f"} Apr 24 21:23:36.090783 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:36.090730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" event={"ID":"06fbf7df-838a-4579-9dbf-44959637e896","Type":"ContainerStarted","Data":"47a452c409185fc025b57fbc6ee0e6fd865b4588d1a6cb3064e5df3f9cbe305b"} Apr 24 21:23:36.091163 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:36.090867 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:23:36.091965 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:36.091941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" event={"ID":"10d5fc69-d500-40da-ab21-9252bde96100","Type":"ContainerStarted","Data":"249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d"} Apr 24 21:23:36.092084 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:36.092048 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:23:36.113527 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:36.110422 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" podStartSLOduration=1.148123684 podStartE2EDuration="4.110397047s" podCreationTimestamp="2026-04-24 21:23:32 +0000 UTC" firstStartedPulling="2026-04-24 21:23:32.894562214 +0000 UTC m=+436.862364254" lastFinishedPulling="2026-04-24 21:23:35.856835564 +0000 UTC m=+439.824637617" observedRunningTime="2026-04-24 21:23:36.10724306 +0000 UTC m=+440.075045122" watchObservedRunningTime="2026-04-24 
21:23:36.110397047 +0000 UTC m=+440.078199113" Apr 24 21:23:36.129020 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:23:36.128981 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" podStartSLOduration=1.129614546 podStartE2EDuration="4.128970748s" podCreationTimestamp="2026-04-24 21:23:32 +0000 UTC" firstStartedPulling="2026-04-24 21:23:32.880467852 +0000 UTC m=+436.848269893" lastFinishedPulling="2026-04-24 21:23:35.879824045 +0000 UTC m=+439.847626095" observedRunningTime="2026-04-24 21:23:36.12782443 +0000 UTC m=+440.095626491" watchObservedRunningTime="2026-04-24 21:23:36.128970748 +0000 UTC m=+440.096772809" Apr 24 21:24:07.097492 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:07.097461 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rq55d" Apr 24 21:24:07.100332 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:07.100312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:24:08.391459 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.391427 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-lgpwz"] Apr 24 21:24:08.391857 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.391610 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" podUID="10d5fc69-d500-40da-ab21-9252bde96100" containerName="manager" containerID="cri-o://249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d" gracePeriod=10 Apr 24 21:24:08.414136 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.414114 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-wh42d"] Apr 24 21:24:08.417240 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.417224 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.429080 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.429059 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-wh42d"] Apr 24 21:24:08.442874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.442854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ad0017-0c4c-4b7f-b789-c3f268e9a34b-cert\") pod \"kserve-controller-manager-74fc8f6f96-wh42d\" (UID: \"26ad0017-0c4c-4b7f-b789-c3f268e9a34b\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.442966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.442894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxtlg\" (UniqueName: \"kubernetes.io/projected/26ad0017-0c4c-4b7f-b789-c3f268e9a34b-kube-api-access-mxtlg\") pod \"kserve-controller-manager-74fc8f6f96-wh42d\" (UID: \"26ad0017-0c4c-4b7f-b789-c3f268e9a34b\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.543814 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.543774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtlg\" (UniqueName: \"kubernetes.io/projected/26ad0017-0c4c-4b7f-b789-c3f268e9a34b-kube-api-access-mxtlg\") pod \"kserve-controller-manager-74fc8f6f96-wh42d\" (UID: \"26ad0017-0c4c-4b7f-b789-c3f268e9a34b\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.543980 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.543874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ad0017-0c4c-4b7f-b789-c3f268e9a34b-cert\") pod \"kserve-controller-manager-74fc8f6f96-wh42d\" (UID: \"26ad0017-0c4c-4b7f-b789-c3f268e9a34b\") 
" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.546194 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.546171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ad0017-0c4c-4b7f-b789-c3f268e9a34b-cert\") pod \"kserve-controller-manager-74fc8f6f96-wh42d\" (UID: \"26ad0017-0c4c-4b7f-b789-c3f268e9a34b\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.556472 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.556424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxtlg\" (UniqueName: \"kubernetes.io/projected/26ad0017-0c4c-4b7f-b789-c3f268e9a34b-kube-api-access-mxtlg\") pod \"kserve-controller-manager-74fc8f6f96-wh42d\" (UID: \"26ad0017-0c4c-4b7f-b789-c3f268e9a34b\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.624505 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.624485 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:24:08.644399 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.644344 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsnd4\" (UniqueName: \"kubernetes.io/projected/10d5fc69-d500-40da-ab21-9252bde96100-kube-api-access-nsnd4\") pod \"10d5fc69-d500-40da-ab21-9252bde96100\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " Apr 24 21:24:08.644399 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.644396 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10d5fc69-d500-40da-ab21-9252bde96100-cert\") pod \"10d5fc69-d500-40da-ab21-9252bde96100\" (UID: \"10d5fc69-d500-40da-ab21-9252bde96100\") " Apr 24 21:24:08.646855 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.646827 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d5fc69-d500-40da-ab21-9252bde96100-cert" (OuterVolumeSpecName: "cert") pod "10d5fc69-d500-40da-ab21-9252bde96100" (UID: "10d5fc69-d500-40da-ab21-9252bde96100"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:24:08.647050 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.647025 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d5fc69-d500-40da-ab21-9252bde96100-kube-api-access-nsnd4" (OuterVolumeSpecName: "kube-api-access-nsnd4") pod "10d5fc69-d500-40da-ab21-9252bde96100" (UID: "10d5fc69-d500-40da-ab21-9252bde96100"). InnerVolumeSpecName "kube-api-access-nsnd4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:24:08.745595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.745565 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10d5fc69-d500-40da-ab21-9252bde96100-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:24:08.745595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.745590 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsnd4\" (UniqueName: \"kubernetes.io/projected/10d5fc69-d500-40da-ab21-9252bde96100-kube-api-access-nsnd4\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:24:08.775035 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.775000 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:08.895800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:08.895777 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-wh42d"] Apr 24 21:24:08.898375 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:24:08.898348 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ad0017_0c4c_4b7f_b789_c3f268e9a34b.slice/crio-2ed2ff1464d545458ea9d4433a39365c4ab0abe978ec07fae0c4f1666f5afca6 WatchSource:0}: Error finding container 2ed2ff1464d545458ea9d4433a39365c4ab0abe978ec07fae0c4f1666f5afca6: Status 404 returned error can't find the container with id 2ed2ff1464d545458ea9d4433a39365c4ab0abe978ec07fae0c4f1666f5afca6 Apr 24 21:24:09.184938 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.184901 2578 generic.go:358] "Generic (PLEG): container finished" podID="10d5fc69-d500-40da-ab21-9252bde96100" containerID="249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d" exitCode=0 Apr 24 21:24:09.185083 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.184971 2578 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" Apr 24 21:24:09.185083 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.184988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" event={"ID":"10d5fc69-d500-40da-ab21-9252bde96100","Type":"ContainerDied","Data":"249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d"} Apr 24 21:24:09.185083 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.185033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-lgpwz" event={"ID":"10d5fc69-d500-40da-ab21-9252bde96100","Type":"ContainerDied","Data":"312effae72dd3b396d829329cac8e43c08736f84365a404bbdb5ec3584939b2f"} Apr 24 21:24:09.185083 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.185052 2578 scope.go:117] "RemoveContainer" containerID="249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d" Apr 24 21:24:09.185973 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.185951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" event={"ID":"26ad0017-0c4c-4b7f-b789-c3f268e9a34b","Type":"ContainerStarted","Data":"2ed2ff1464d545458ea9d4433a39365c4ab0abe978ec07fae0c4f1666f5afca6"} Apr 24 21:24:09.192301 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.192285 2578 scope.go:117] "RemoveContainer" containerID="249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d" Apr 24 21:24:09.192557 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:24:09.192520 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d\": container with ID starting with 249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d not found: ID does not exist" 
containerID="249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d" Apr 24 21:24:09.192610 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.192559 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d"} err="failed to get container status \"249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d\": rpc error: code = NotFound desc = could not find container \"249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d\": container with ID starting with 249b42527272839b7062501156899512fec6474788bdcadc69478c84444c807d not found: ID does not exist" Apr 24 21:24:09.201828 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.201805 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-lgpwz"] Apr 24 21:24:09.204932 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:09.204914 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-lgpwz"] Apr 24 21:24:10.190239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:10.190203 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" event={"ID":"26ad0017-0c4c-4b7f-b789-c3f268e9a34b","Type":"ContainerStarted","Data":"6493010fbf1741856e1a59167dcc6ee2a1801c66bb42930a023e4971c5bb3a6a"} Apr 24 21:24:10.190604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:10.190343 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:10.208012 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:10.207966 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" podStartSLOduration=1.570062655 podStartE2EDuration="2.207953318s" podCreationTimestamp="2026-04-24 21:24:08 +0000 UTC" 
firstStartedPulling="2026-04-24 21:24:08.899568769 +0000 UTC m=+472.867370809" lastFinishedPulling="2026-04-24 21:24:09.537459428 +0000 UTC m=+473.505261472" observedRunningTime="2026-04-24 21:24:10.20592887 +0000 UTC m=+474.173730931" watchObservedRunningTime="2026-04-24 21:24:10.207953318 +0000 UTC m=+474.175755376" Apr 24 21:24:10.659589 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:10.659563 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d5fc69-d500-40da-ab21-9252bde96100" path="/var/lib/kubelet/pods/10d5fc69-d500-40da-ab21-9252bde96100/volumes" Apr 24 21:24:41.197470 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:41.197439 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-wh42d" Apr 24 21:24:48.361863 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.361827 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67cd8c69c5-42vtl"] Apr 24 21:24:48.362318 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.362297 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10d5fc69-d500-40da-ab21-9252bde96100" containerName="manager" Apr 24 21:24:48.362318 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.362315 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d5fc69-d500-40da-ab21-9252bde96100" containerName="manager" Apr 24 21:24:48.362440 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.362413 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="10d5fc69-d500-40da-ab21-9252bde96100" containerName="manager" Apr 24 21:24:48.365336 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.365314 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.379914 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.379889 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cd8c69c5-42vtl"] Apr 24 21:24:48.426867 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.426841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-console-config\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.426954 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.426872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-oauth-serving-cert\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.427006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.426949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3a2bd38-885a-4011-aafa-732484642c2d-console-oauth-config\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.427006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.426967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-trusted-ca-bundle\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 
21:24:48.427073 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.427013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-service-ca\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.427073 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.427032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3a2bd38-885a-4011-aafa-732484642c2d-console-serving-cert\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.427073 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.427053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9f6v\" (UniqueName: \"kubernetes.io/projected/c3a2bd38-885a-4011-aafa-732484642c2d-kube-api-access-c9f6v\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3a2bd38-885a-4011-aafa-732484642c2d-console-oauth-config\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-trusted-ca-bundle\") pod 
\"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-service-ca\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3a2bd38-885a-4011-aafa-732484642c2d-console-serving-cert\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9f6v\" (UniqueName: \"kubernetes.io/projected/c3a2bd38-885a-4011-aafa-732484642c2d-kube-api-access-c9f6v\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-console-config\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.527659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.527620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-oauth-serving-cert\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.528263 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.528236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-service-ca\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.528349 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.528286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-oauth-serving-cert\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.528397 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.528381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-console-config\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.528608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.528588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3a2bd38-885a-4011-aafa-732484642c2d-trusted-ca-bundle\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.530037 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.530010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3a2bd38-885a-4011-aafa-732484642c2d-console-oauth-config\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.530115 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.530098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3a2bd38-885a-4011-aafa-732484642c2d-console-serving-cert\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.536940 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.536918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9f6v\" (UniqueName: \"kubernetes.io/projected/c3a2bd38-885a-4011-aafa-732484642c2d-kube-api-access-c9f6v\") pod \"console-67cd8c69c5-42vtl\" (UID: \"c3a2bd38-885a-4011-aafa-732484642c2d\") " pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.673856 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.673826 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:48.795170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:48.795118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cd8c69c5-42vtl"] Apr 24 21:24:48.797501 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:24:48.797470 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3a2bd38_885a_4011_aafa_732484642c2d.slice/crio-5e671ac39cea2c8242149af330892e84aaddabb2a1793cbc3ee42af341132447 WatchSource:0}: Error finding container 5e671ac39cea2c8242149af330892e84aaddabb2a1793cbc3ee42af341132447: Status 404 returned error can't find the container with id 5e671ac39cea2c8242149af330892e84aaddabb2a1793cbc3ee42af341132447 Apr 24 21:24:49.299263 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:49.299224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cd8c69c5-42vtl" event={"ID":"c3a2bd38-885a-4011-aafa-732484642c2d","Type":"ContainerStarted","Data":"36e018ee1dcdde85e33d46e879a34264f14662af4d8065918e92914c684a45b3"} Apr 24 21:24:49.299410 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:49.299267 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cd8c69c5-42vtl" event={"ID":"c3a2bd38-885a-4011-aafa-732484642c2d","Type":"ContainerStarted","Data":"5e671ac39cea2c8242149af330892e84aaddabb2a1793cbc3ee42af341132447"} Apr 24 21:24:49.336205 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:49.336164 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67cd8c69c5-42vtl" podStartSLOduration=1.336149451 podStartE2EDuration="1.336149451s" podCreationTimestamp="2026-04-24 21:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:24:49.335831471 +0000 UTC 
m=+513.303633531" watchObservedRunningTime="2026-04-24 21:24:49.336149451 +0000 UTC m=+513.303951512" Apr 24 21:24:58.674155 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:58.674081 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:58.674155 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:58.674124 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:58.678625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:58.678602 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:59.331781 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:59.331743 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67cd8c69c5-42vtl" Apr 24 21:24:59.380075 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:24:59.380043 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-99c44456d-jq9fv"] Apr 24 21:25:24.402361 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.402322 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-99c44456d-jq9fv" podUID="4982c561-07c4-461e-a7c8-9f4e7b7406e6" containerName="console" containerID="cri-o://148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a" gracePeriod=15 Apr 24 21:25:24.632820 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.632794 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-99c44456d-jq9fv_4982c561-07c4-461e-a7c8-9f4e7b7406e6/console/0.log" Apr 24 21:25:24.632915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.632855 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-99c44456d-jq9fv" Apr 24 21:25:24.802250 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802192 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd49t\" (UniqueName: \"kubernetes.io/projected/4982c561-07c4-461e-a7c8-9f4e7b7406e6-kube-api-access-jd49t\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " Apr 24 21:25:24.802250 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802223 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-serving-cert\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " Apr 24 21:25:24.802250 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802244 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-oauth-config\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " Apr 24 21:25:24.802493 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802265 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-trusted-ca-bundle\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " Apr 24 21:25:24.802493 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802288 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-service-ca\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") " Apr 24 21:25:24.802493 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802326 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-config\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") "
Apr 24 21:25:24.802493 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802409 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-oauth-serving-cert\") pod \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\" (UID: \"4982c561-07c4-461e-a7c8-9f4e7b7406e6\") "
Apr 24 21:25:24.802846 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802813 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-config" (OuterVolumeSpecName: "console-config") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:25:24.802846 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802838 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:25:24.803007 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802839 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:25:24.803007 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.802820 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-service-ca" (OuterVolumeSpecName: "service-ca") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:25:24.804491 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.804460 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:25:24.804595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.804549 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:25:24.804637 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.804613 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4982c561-07c4-461e-a7c8-9f4e7b7406e6-kube-api-access-jd49t" (OuterVolumeSpecName: "kube-api-access-jd49t") pod "4982c561-07c4-461e-a7c8-9f4e7b7406e6" (UID: "4982c561-07c4-461e-a7c8-9f4e7b7406e6"). InnerVolumeSpecName "kube-api-access-jd49t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:25:24.902915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902888 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:24.902915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902912 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-oauth-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:24.903054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902926 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-trusted-ca-bundle\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:24.903054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902939 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-service-ca\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:24.903054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902952 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-console-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:24.903054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902965 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4982c561-07c4-461e-a7c8-9f4e7b7406e6-oauth-serving-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:24.903054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:24.902977 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jd49t\" (UniqueName: \"kubernetes.io/projected/4982c561-07c4-461e-a7c8-9f4e7b7406e6-kube-api-access-jd49t\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:25:25.405216 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.405187 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-99c44456d-jq9fv_4982c561-07c4-461e-a7c8-9f4e7b7406e6/console/0.log"
Apr 24 21:25:25.405634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.405224 2578 generic.go:358] "Generic (PLEG): container finished" podID="4982c561-07c4-461e-a7c8-9f4e7b7406e6" containerID="148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a" exitCode=2
Apr 24 21:25:25.405634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.405250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99c44456d-jq9fv" event={"ID":"4982c561-07c4-461e-a7c8-9f4e7b7406e6","Type":"ContainerDied","Data":"148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a"}
Apr 24 21:25:25.405634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.405271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99c44456d-jq9fv" event={"ID":"4982c561-07c4-461e-a7c8-9f4e7b7406e6","Type":"ContainerDied","Data":"0cf8a260678c50983bf464b61eb94bbec059f4f2843cf0e66e093a87a45b7836"}
Apr 24 21:25:25.405634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.405285 2578 scope.go:117] "RemoveContainer" containerID="148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a"
Apr 24 21:25:25.405634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.405287 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99c44456d-jq9fv"
Apr 24 21:25:25.413499 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.413482 2578 scope.go:117] "RemoveContainer" containerID="148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a"
Apr 24 21:25:25.413772 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:25:25.413733 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a\": container with ID starting with 148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a not found: ID does not exist" containerID="148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a"
Apr 24 21:25:25.413809 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.413784 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a"} err="failed to get container status \"148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a\": rpc error: code = NotFound desc = could not find container \"148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a\": container with ID starting with 148705aaa90f9e90b2d1c035c6c373a11d950019dcf4603fbca9a15f49abc71a not found: ID does not exist"
Apr 24 21:25:25.426275 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.426250 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-99c44456d-jq9fv"]
Apr 24 21:25:25.429564 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:25.429543 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-99c44456d-jq9fv"]
Apr 24 21:25:26.660439 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:26.660408 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4982c561-07c4-461e-a7c8-9f4e7b7406e6" path="/var/lib/kubelet/pods/4982c561-07c4-461e-a7c8-9f4e7b7406e6/volumes"
Apr 24 21:25:42.578144 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.578106 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"]
Apr 24 21:25:42.578514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.578498 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4982c561-07c4-461e-a7c8-9f4e7b7406e6" containerName="console"
Apr 24 21:25:42.578558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.578515 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4982c561-07c4-461e-a7c8-9f4e7b7406e6" containerName="console"
Apr 24 21:25:42.578595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.578584 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4982c561-07c4-461e-a7c8-9f4e7b7406e6" containerName="console"
Apr 24 21:25:42.582048 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.582029 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.584235 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.584211 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:25:42.584467 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.584446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\""
Apr 24 21:25:42.584467 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.584457 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:25:42.584631 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.584504 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\""
Apr 24 21:25:42.584631 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.584508 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\""
Apr 24 21:25:42.591299 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.591277 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"]
Apr 24 21:25:42.622222 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.622194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7w2\" (UniqueName: \"kubernetes.io/projected/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kube-api-access-6x7w2\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.622331 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.622249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.622331 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.622319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.622449 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.622359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a89ddc5a-cd7d-47f8-830e-477931a4cd18-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.723021 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.722994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.723145 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.723028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.723145 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.723045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a89ddc5a-cd7d-47f8-830e-477931a4cd18-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.723145 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.723083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7w2\" (UniqueName: \"kubernetes.io/projected/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kube-api-access-6x7w2\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.723145 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:25:42.723132 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-serving-cert: secret "isvc-sklearn-batcher-predictor-serving-cert" not found
Apr 24 21:25:42.723383 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:25:42.723203 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls podName:a89ddc5a-cd7d-47f8-830e-477931a4cd18 nodeName:}" failed. No retries permitted until 2026-04-24 21:25:43.223184658 +0000 UTC m=+567.190986703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls") pod "isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" (UID: "a89ddc5a-cd7d-47f8-830e-477931a4cd18") : secret "isvc-sklearn-batcher-predictor-serving-cert" not found
Apr 24 21:25:42.723450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.723393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.723625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.723605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a89ddc5a-cd7d-47f8-830e-477931a4cd18-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:42.733534 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:42.733512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7w2\" (UniqueName: \"kubernetes.io/projected/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kube-api-access-6x7w2\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:43.226150 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:43.226123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:43.228479 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:43.228462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:43.494239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:43.494158 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:25:43.611449 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:43.611426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"]
Apr 24 21:25:43.614104 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:25:43.614060 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89ddc5a_cd7d_47f8_830e_477931a4cd18.slice/crio-76139c3335f6ac40f753eed2e61028c9494255212eb11ad1c860d1af2a0e00ae WatchSource:0}: Error finding container 76139c3335f6ac40f753eed2e61028c9494255212eb11ad1c860d1af2a0e00ae: Status 404 returned error can't find the container with id 76139c3335f6ac40f753eed2e61028c9494255212eb11ad1c860d1af2a0e00ae
Apr 24 21:25:44.466308 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:44.466240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerStarted","Data":"76139c3335f6ac40f753eed2e61028c9494255212eb11ad1c860d1af2a0e00ae"}
Apr 24 21:25:48.479496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:48.479457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerStarted","Data":"66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302"}
Apr 24 21:25:51.488712 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:51.488644 2578 generic.go:358] "Generic (PLEG): container finished" podID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerID="66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302" exitCode=0
Apr 24 21:25:51.489014 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:25:51.488719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerDied","Data":"66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302"}
Apr 24 21:26:04.541995 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:04.541954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerStarted","Data":"5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999"}
Apr 24 21:26:06.550315 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:06.550278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerStarted","Data":"5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621"}
Apr 24 21:26:09.563108 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:09.563075 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerStarted","Data":"18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229"}
Apr 24 21:26:09.563550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:09.563281 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:26:09.563550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:09.563409 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:26:09.564515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:09.564496 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:09.581465 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:09.581338 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podStartSLOduration=2.322875541 podStartE2EDuration="27.581327191s" podCreationTimestamp="2026-04-24 21:25:42 +0000 UTC" firstStartedPulling="2026-04-24 21:25:43.616092286 +0000 UTC m=+567.583894327" lastFinishedPulling="2026-04-24 21:26:08.874543933 +0000 UTC m=+592.842345977" observedRunningTime="2026-04-24 21:26:09.5810486 +0000 UTC m=+593.548850660" watchObservedRunningTime="2026-04-24 21:26:09.581327191 +0000 UTC m=+593.549129253"
Apr 24 21:26:10.566029 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:10.565990 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:26:10.566480 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:10.566151 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:10.567054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:10.567030 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:26:11.568471 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:11.568427 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:11.568973 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:11.568948 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:26:11.571873 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:11.571855 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:26:12.576977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:12.576929 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:12.577389 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:12.577129 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:26:16.562255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:16.562221 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 21:26:16.562650 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:16.562484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 21:26:22.576993 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:22.576952 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:22.577432 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:22.577133 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:26:32.577198 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:32.577152 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:32.682086 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:32.577613 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:26:42.577724 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:42.577680 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:42.578197 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:42.578061 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:26:52.577874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:52.577829 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:26:52.578298 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:26:52.578273 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:27:02.577696 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:02.577645 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:27:02.578136 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:02.578114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:27:12.577915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:12.577887 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:27:12.578298 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:12.578201 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"
Apr 24 21:27:27.619170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.619136 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"]
Apr 24 21:27:27.619671 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.619614 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" containerID="cri-o://5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999" gracePeriod=30
Apr 24 21:27:27.619671 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.619645 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" containerID="cri-o://18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229" gracePeriod=30
Apr 24 21:27:27.619819 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.619655 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" containerID="cri-o://5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621" gracePeriod=30
Apr 24 21:27:27.772248 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.772220 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"]
Apr 24 21:27:27.775829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.775810 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.778520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.778502 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\""
Apr 24 21:27:27.778520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.778513 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\""
Apr 24 21:27:27.789005 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.788983 2578 generic.go:358] "Generic (PLEG): container finished" podID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerID="5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621" exitCode=2
Apr 24 21:27:27.789193 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.789081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerDied","Data":"5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621"}
Apr 24 21:27:27.790612 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.790592 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"]
Apr 24 21:27:27.825581 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.825561 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d0f86f7-103c-47ac-81ba-2cafe28b7661-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.825681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.825590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7tz\" (UniqueName: \"kubernetes.io/projected/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kube-api-access-gd7tz\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.825681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.825620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d0f86f7-103c-47ac-81ba-2cafe28b7661-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.825790 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.825684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.926764 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.926728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d0f86f7-103c-47ac-81ba-2cafe28b7661-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.926864 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.926778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7tz\" (UniqueName: \"kubernetes.io/projected/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kube-api-access-gd7tz\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.926864 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.926839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d0f86f7-103c-47ac-81ba-2cafe28b7661-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.926968 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.926871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.927287 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.927255 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.927825 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.927802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d0f86f7-103c-47ac-81ba-2cafe28b7661-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.929487 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.929468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d0f86f7-103c-47ac-81ba-2cafe28b7661-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"
Apr 24 21:27:27.935795 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:27.935773 2578 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-gd7tz\" (UniqueName: \"kubernetes.io/projected/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kube-api-access-gd7tz\") pod \"isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:27:28.085829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:28.085803 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:27:28.211561 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:28.211487 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"] Apr 24 21:27:28.214895 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:27:28.214867 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0f86f7_103c_47ac_81ba_2cafe28b7661.slice/crio-55e1dd58195df08fba13bfbe818c80bae9c91b48fb63d02b1eca932a5326819e WatchSource:0}: Error finding container 55e1dd58195df08fba13bfbe818c80bae9c91b48fb63d02b1eca932a5326819e: Status 404 returned error can't find the container with id 55e1dd58195df08fba13bfbe818c80bae9c91b48fb63d02b1eca932a5326819e Apr 24 21:27:28.216678 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:28.216661 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:28.793284 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:28.793249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerStarted","Data":"6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6"} Apr 24 21:27:28.793641 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:27:28.793289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerStarted","Data":"55e1dd58195df08fba13bfbe818c80bae9c91b48fb63d02b1eca932a5326819e"} Apr 24 21:27:31.568958 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:31.568926 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:27:31.802316 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:31.802284 2578 generic.go:358] "Generic (PLEG): container finished" podID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerID="5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999" exitCode=0 Apr 24 21:27:31.802440 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:31.802356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerDied","Data":"5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999"} Apr 24 21:27:32.577591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:32.577505 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:27:32.578002 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:32.577837 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" 
containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:32.806279 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:32.806250 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerID="6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6" exitCode=0 Apr 24 21:27:32.806444 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:32.806317 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerDied","Data":"6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6"} Apr 24 21:27:33.811123 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.811087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerStarted","Data":"7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0"} Apr 24 21:27:33.811123 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.811127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerStarted","Data":"72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0"} Apr 24 21:27:33.811591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.811138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerStarted","Data":"c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d"} Apr 24 21:27:33.811591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.811376 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:27:33.811591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.811507 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:27:33.812818 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.812792 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:27:33.832445 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:33.832398 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podStartSLOduration=6.832385083 podStartE2EDuration="6.832385083s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:33.830716653 +0000 UTC m=+677.798518714" watchObservedRunningTime="2026-04-24 21:27:33.832385083 +0000 UTC m=+677.800187198" Apr 24 21:27:34.814044 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:34.814010 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:27:34.814511 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:34.814037 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:27:34.815066 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:27:34.815039 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:35.816805 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:35.816736 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:27:35.817154 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:35.817093 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:36.568953 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:36.568909 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:27:40.820669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:40.820638 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:27:40.821269 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:40.821243 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:27:40.821581 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:40.821559 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:41.568875 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:41.568833 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:27:41.569039 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:41.568949 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" Apr 24 21:27:42.576864 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:42.576824 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:27:42.577291 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:42.577076 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:46.568621 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:46.568581 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:27:50.821185 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:50.821136 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:27:50.821604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:50.821529 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:51.568678 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:51.568636 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:27:52.577837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:52.577794 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:27:52.578260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:52.577938 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" Apr 24 21:27:52.578260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:52.578037 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:52.578260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:52.578151 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" Apr 24 21:27:56.569031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:56.568943 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:27:57.769338 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.769311 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" Apr 24 21:27:57.852585 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.852553 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a89ddc5a-cd7d-47f8-830e-477931a4cd18-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " Apr 24 21:27:57.852721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.852660 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x7w2\" (UniqueName: \"kubernetes.io/projected/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kube-api-access-6x7w2\") pod \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " Apr 24 21:27:57.852721 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.852692 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kserve-provision-location\") pod \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " Apr 24 21:27:57.852886 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.852864 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls\") pod \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\" (UID: \"a89ddc5a-cd7d-47f8-830e-477931a4cd18\") " Apr 24 21:27:57.853012 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.852889 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a89ddc5a-cd7d-47f8-830e-477931a4cd18-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "a89ddc5a-cd7d-47f8-830e-477931a4cd18" (UID: "a89ddc5a-cd7d-47f8-830e-477931a4cd18"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:27:57.853074 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.853025 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a89ddc5a-cd7d-47f8-830e-477931a4cd18" (UID: "a89ddc5a-cd7d-47f8-830e-477931a4cd18"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:57.853172 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.853145 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:27:57.853172 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.853171 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a89ddc5a-cd7d-47f8-830e-477931a4cd18-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:27:57.854978 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.854959 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a89ddc5a-cd7d-47f8-830e-477931a4cd18" (UID: "a89ddc5a-cd7d-47f8-830e-477931a4cd18"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:27:57.855055 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.854985 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kube-api-access-6x7w2" (OuterVolumeSpecName: "kube-api-access-6x7w2") pod "a89ddc5a-cd7d-47f8-830e-477931a4cd18" (UID: "a89ddc5a-cd7d-47f8-830e-477931a4cd18"). InnerVolumeSpecName "kube-api-access-6x7w2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:27:57.884857 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.884807 2578 generic.go:358] "Generic (PLEG): container finished" podID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerID="18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229" exitCode=0 Apr 24 21:27:57.884944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.884891 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" Apr 24 21:27:57.884944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.884892 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerDied","Data":"18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229"} Apr 24 21:27:57.884944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.884936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v" event={"ID":"a89ddc5a-cd7d-47f8-830e-477931a4cd18","Type":"ContainerDied","Data":"76139c3335f6ac40f753eed2e61028c9494255212eb11ad1c860d1af2a0e00ae"} Apr 24 21:27:57.885041 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.884952 2578 scope.go:117] "RemoveContainer" containerID="18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229" Apr 24 21:27:57.892850 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.892836 2578 scope.go:117] "RemoveContainer" containerID="5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621" Apr 24 21:27:57.900215 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.900195 2578 scope.go:117] "RemoveContainer" containerID="5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999" Apr 24 21:27:57.905932 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.905911 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"] Apr 24 21:27:57.907356 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.907340 2578 scope.go:117] "RemoveContainer" containerID="66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302" Apr 24 21:27:57.912381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.912354 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6995d6bbb4-lj24v"] Apr 24 21:27:57.914523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.914508 2578 scope.go:117] "RemoveContainer" containerID="18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229" Apr 24 21:27:57.914874 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:27:57.914848 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229\": container with ID starting with 18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229 not found: ID does not exist" containerID="18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229" Apr 24 21:27:57.914938 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.914883 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229"} err="failed to get container status 
\"18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229\": rpc error: code = NotFound desc = could not find container \"18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229\": container with ID starting with 18ebfee717aabb721b46470a93167c8750f2f662b8d0cbea032b1e04e6f3e229 not found: ID does not exist" Apr 24 21:27:57.914938 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.914900 2578 scope.go:117] "RemoveContainer" containerID="5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621" Apr 24 21:27:57.915115 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:27:57.915097 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621\": container with ID starting with 5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621 not found: ID does not exist" containerID="5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621" Apr 24 21:27:57.915156 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.915120 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621"} err="failed to get container status \"5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621\": rpc error: code = NotFound desc = could not find container \"5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621\": container with ID starting with 5c3e0d1aeffa1f4b2bb6944c72091280dd6a16672237796af3586fab23337621 not found: ID does not exist" Apr 24 21:27:57.915156 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.915134 2578 scope.go:117] "RemoveContainer" containerID="5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999" Apr 24 21:27:57.915311 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:27:57.915296 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999\": container with ID starting with 5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999 not found: ID does not exist" containerID="5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999" Apr 24 21:27:57.915350 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.915314 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999"} err="failed to get container status \"5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999\": rpc error: code = NotFound desc = could not find container \"5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999\": container with ID starting with 5d8e36248fb69a43074107dde85544c78077be10934e396e2c5e091840d9c999 not found: ID does not exist" Apr 24 21:27:57.915350 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.915326 2578 scope.go:117] "RemoveContainer" containerID="66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302" Apr 24 21:27:57.915498 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:27:57.915482 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302\": container with ID starting with 66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302 not found: ID does not exist" containerID="66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302" Apr 24 21:27:57.915537 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.915501 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302"} err="failed to get container status \"66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302\": rpc error: code = NotFound desc = could not 
find container \"66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302\": container with ID starting with 66021c524c695119497a0cab05759cff11b785d11b9710d7db4a615ebe739302 not found: ID does not exist" Apr 24 21:27:57.953533 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.953510 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6x7w2\" (UniqueName: \"kubernetes.io/projected/a89ddc5a-cd7d-47f8-830e-477931a4cd18-kube-api-access-6x7w2\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:27:57.953533 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:57.953530 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a89ddc5a-cd7d-47f8-830e-477931a4cd18-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:27:58.660567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:27:58.660526 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" path="/var/lib/kubelet/pods/a89ddc5a-cd7d-47f8-830e-477931a4cd18/volumes" Apr 24 21:28:00.821548 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:00.821512 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:28:00.822019 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:00.821998 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:10.821145 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:10.821105 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:28:10.821607 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:10.821583 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:20.821874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:20.821827 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:28:20.822312 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:20.822280 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:30.821887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:30.821840 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:28:30.822361 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:30.822253 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" 
containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:40.821629 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:40.821601 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:28:40.822007 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:40.821840 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:28:52.745862 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.745827 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"] Apr 24 21:28:52.746259 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.746196 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" containerID="cri-o://c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d" gracePeriod=30 Apr 24 21:28:52.746259 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.746223 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" containerID="cri-o://7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0" gracePeriod=30 Apr 24 21:28:52.746374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.746271 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" 
containerID="cri-o://72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0" gracePeriod=30 Apr 24 21:28:52.819883 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.819839 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt"] Apr 24 21:28:52.820182 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820170 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" Apr 24 21:28:52.820224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820183 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" Apr 24 21:28:52.820224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820199 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" Apr 24 21:28:52.820224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820204 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" Apr 24 21:28:52.820224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820215 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" Apr 24 21:28:52.820224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820220 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" Apr 24 21:28:52.820366 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820245 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="storage-initializer" Apr 24 21:28:52.820366 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820251 2578 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="storage-initializer" Apr 24 21:28:52.820366 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820303 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kube-rbac-proxy" Apr 24 21:28:52.820366 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820312 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="kserve-container" Apr 24 21:28:52.820366 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.820320 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a89ddc5a-cd7d-47f8-830e-477931a4cd18" containerName="agent" Apr 24 21:28:52.823448 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.823428 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.825740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.825719 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 24 21:28:52.825838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.825740 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 24 21:28:52.833325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.833296 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt"] Apr 24 21:28:52.846946 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.846919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26f6v\" (UniqueName: \"kubernetes.io/projected/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-kube-api-access-26f6v\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" 
(UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.847029 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.846991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.847029 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.847009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.948158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.948132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.948276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.948164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" 
Apr 24 21:28:52.948276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.948197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26f6v\" (UniqueName: \"kubernetes.io/projected/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-kube-api-access-26f6v\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.948366 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:28:52.948340 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 24 21:28:52.948407 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:28:52.948396 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls podName:1a8b3832-4b60-40a3-8ac3-4340624f9a3c nodeName:}" failed. No retries permitted until 2026-04-24 21:28:53.448379828 +0000 UTC m=+757.416181873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-npgqt" (UID: "1a8b3832-4b60-40a3-8ac3-4340624f9a3c") : secret "message-dumper-predictor-serving-cert" not found Apr 24 21:28:52.948739 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.948719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:52.959712 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:52.959693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26f6v\" (UniqueName: \"kubernetes.io/projected/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-kube-api-access-26f6v\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:53.043438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:53.043374 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerID="72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0" exitCode=2 Apr 24 21:28:53.043438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:53.043425 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerDied","Data":"72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0"} Apr 24 21:28:53.452014 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:53.451982 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:53.454516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:53.454493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-npgqt\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:53.734144 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:53.734063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:53.853427 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:53.853400 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt"] Apr 24 21:28:53.856021 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:28:53.855989 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8b3832_4b60_40a3_8ac3_4340624f9a3c.slice/crio-8ed93ed288b54ac867c1784b7250509ef9aac7e2f5cbfbfc8095ca693199eae2 WatchSource:0}: Error finding container 8ed93ed288b54ac867c1784b7250509ef9aac7e2f5cbfbfc8095ca693199eae2: Status 404 returned error can't find the container with id 8ed93ed288b54ac867c1784b7250509ef9aac7e2f5cbfbfc8095ca693199eae2 Apr 24 21:28:54.047227 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:54.047154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" 
event={"ID":"1a8b3832-4b60-40a3-8ac3-4340624f9a3c","Type":"ContainerStarted","Data":"8ed93ed288b54ac867c1784b7250509ef9aac7e2f5cbfbfc8095ca693199eae2"} Apr 24 21:28:55.052200 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:55.052174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" event={"ID":"1a8b3832-4b60-40a3-8ac3-4340624f9a3c","Type":"ContainerStarted","Data":"3c8b07915beaaeb4b08e212e850f485abdf8cd93a290fc942701b610a6cef627"} Apr 24 21:28:55.052529 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:55.052208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" event={"ID":"1a8b3832-4b60-40a3-8ac3-4340624f9a3c","Type":"ContainerStarted","Data":"ba73df103d8dedf99d909526e1b575e966d3751ea47094f583eb8a1d601663b7"} Apr 24 21:28:55.052529 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:55.052335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:55.070869 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:55.070828 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" podStartSLOduration=2.021314359 podStartE2EDuration="3.070816294s" podCreationTimestamp="2026-04-24 21:28:52 +0000 UTC" firstStartedPulling="2026-04-24 21:28:53.85783797 +0000 UTC m=+757.825640024" lastFinishedPulling="2026-04-24 21:28:54.907339919 +0000 UTC m=+758.875141959" observedRunningTime="2026-04-24 21:28:55.068820611 +0000 UTC m=+759.036622672" watchObservedRunningTime="2026-04-24 21:28:55.070816294 +0000 UTC m=+759.038618413" Apr 24 21:28:55.817107 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:55.817070 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" 
podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:28:56.054926 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:56.054894 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:56.056518 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:56.056496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:28:57.059829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:57.059797 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerID="c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d" exitCode=0 Apr 24 21:28:57.060203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:28:57.059858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerDied","Data":"c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d"} Apr 24 21:29:00.817173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:00.817129 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:29:00.821404 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:00.821372 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:29:00.821677 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:00.821654 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:03.067846 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:03.067815 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:29:05.817537 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:05.817502 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:29:05.817897 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:05.817612 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:29:10.817626 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:10.817593 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:29:10.822075 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:10.822037 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:29:10.822343 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:10.822319 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:12.856500 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:12.856469 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z"] Apr 24 21:29:12.860710 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:12.860682 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:12.862846 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:12.862826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 24 21:29:12.863157 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:12.863137 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 24 21:29:12.873016 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:12.872988 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z"] Apr 24 21:29:13.007468 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.007444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5t9\" (UniqueName: \"kubernetes.io/projected/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kube-api-access-qs5t9\") pod 
\"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.007596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.007492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.007596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.007541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kserve-provision-location\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.007596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.007587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-proxy-tls\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.108474 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.108408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5t9\" (UniqueName: \"kubernetes.io/projected/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kube-api-access-qs5t9\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " 
pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.108590 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.108489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.108590 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.108534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kserve-provision-location\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.108590 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.108560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-proxy-tls\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.108911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.108892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kserve-provision-location\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.109179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.109158 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.111247 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.111226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-proxy-tls\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.116244 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.116227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5t9\" (UniqueName: \"kubernetes.io/projected/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kube-api-access-qs5t9\") pod \"isvc-logger-predictor-74744fb9f-pkn5z\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.175881 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.175859 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:13.297568 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:13.297529 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z"] Apr 24 21:29:13.300682 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:29:13.300655 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94c95a50_9fb3_41c1_98c6_3ccd6570bcf5.slice/crio-73c39459254e0352b0f820a6a7febc1dee6040361499940f86cb7c52ffedf065 WatchSource:0}: Error finding container 73c39459254e0352b0f820a6a7febc1dee6040361499940f86cb7c52ffedf065: Status 404 returned error can't find the container with id 73c39459254e0352b0f820a6a7febc1dee6040361499940f86cb7c52ffedf065 Apr 24 21:29:14.110341 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:14.110308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerStarted","Data":"c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8"} Apr 24 21:29:14.110341 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:14.110344 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerStarted","Data":"73c39459254e0352b0f820a6a7febc1dee6040361499940f86cb7c52ffedf065"} Apr 24 21:29:15.817027 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:15.816990 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:29:17.121123 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:17.121058 2578 generic.go:358] "Generic (PLEG): container finished" podID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerID="c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8" exitCode=0 Apr 24 21:29:17.121123 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:17.121092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerDied","Data":"c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8"} Apr 24 21:29:18.130038 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.129999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerStarted","Data":"784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf"} Apr 24 21:29:18.130457 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.130046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerStarted","Data":"656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423"} Apr 24 21:29:18.130457 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.130062 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerStarted","Data":"88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf"} Apr 24 21:29:18.130817 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.130785 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:18.130952 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.130822 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:18.130952 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.130834 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:18.132252 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.132221 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:29:18.133371 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.133340 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:18.151235 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:18.151194 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podStartSLOduration=6.151183555 podStartE2EDuration="6.151183555s" podCreationTimestamp="2026-04-24 21:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:18.150353727 +0000 UTC m=+782.118155790" watchObservedRunningTime="2026-04-24 21:29:18.151183555 +0000 UTC m=+782.118985616" Apr 24 21:29:19.134135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:19.134093 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: 
connection refused" Apr 24 21:29:19.134565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:19.134427 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:20.817358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:20.817322 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:29:20.821725 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:20.821685 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:5000: connect: connection refused" Apr 24 21:29:20.821871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:20.821827 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:29:20.822044 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:20.822011 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:20.822137 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:20.822120 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:29:22.887097 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.887076 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:29:22.987631 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.987608 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kserve-provision-location\") pod \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " Apr 24 21:29:22.987776 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.987643 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d0f86f7-103c-47ac-81ba-2cafe28b7661-proxy-tls\") pod \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " Apr 24 21:29:22.987776 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.987710 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d0f86f7-103c-47ac-81ba-2cafe28b7661-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " Apr 24 21:29:22.987776 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.987730 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7tz\" (UniqueName: \"kubernetes.io/projected/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kube-api-access-gd7tz\") pod \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\" (UID: \"1d0f86f7-103c-47ac-81ba-2cafe28b7661\") " Apr 24 21:29:22.988002 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:29:22.987972 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1d0f86f7-103c-47ac-81ba-2cafe28b7661" (UID: "1d0f86f7-103c-47ac-81ba-2cafe28b7661"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:22.988066 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.988050 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0f86f7-103c-47ac-81ba-2cafe28b7661-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "1d0f86f7-103c-47ac-81ba-2cafe28b7661" (UID: "1d0f86f7-103c-47ac-81ba-2cafe28b7661"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:22.989879 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.989864 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0f86f7-103c-47ac-81ba-2cafe28b7661-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1d0f86f7-103c-47ac-81ba-2cafe28b7661" (UID: "1d0f86f7-103c-47ac-81ba-2cafe28b7661"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:22.990000 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:22.989979 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kube-api-access-gd7tz" (OuterVolumeSpecName: "kube-api-access-gd7tz") pod "1d0f86f7-103c-47ac-81ba-2cafe28b7661" (UID: "1d0f86f7-103c-47ac-81ba-2cafe28b7661"). InnerVolumeSpecName "kube-api-access-gd7tz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:23.089257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.089197 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d0f86f7-103c-47ac-81ba-2cafe28b7661-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:29:23.089257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.089220 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gd7tz\" (UniqueName: \"kubernetes.io/projected/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kube-api-access-gd7tz\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:29:23.089257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.089231 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d0f86f7-103c-47ac-81ba-2cafe28b7661-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:29:23.089257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.089240 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d0f86f7-103c-47ac-81ba-2cafe28b7661-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:29:23.148637 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.148606 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerID="7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0" exitCode=0 Apr 24 21:29:23.148743 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.148694 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" Apr 24 21:29:23.148848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.148688 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerDied","Data":"7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0"} Apr 24 21:29:23.148848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.148803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj" event={"ID":"1d0f86f7-103c-47ac-81ba-2cafe28b7661","Type":"ContainerDied","Data":"55e1dd58195df08fba13bfbe818c80bae9c91b48fb63d02b1eca932a5326819e"} Apr 24 21:29:23.148848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.148820 2578 scope.go:117] "RemoveContainer" containerID="7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0" Apr 24 21:29:23.161998 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.161981 2578 scope.go:117] "RemoveContainer" containerID="72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0" Apr 24 21:29:23.169291 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.169270 2578 scope.go:117] "RemoveContainer" containerID="c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d" Apr 24 21:29:23.175396 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.175178 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"] Apr 24 21:29:23.177793 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.177771 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f8556cf45-nnkcj"] Apr 24 21:29:23.178310 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.178294 2578 scope.go:117] "RemoveContainer" 
containerID="6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6" Apr 24 21:29:23.185146 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185121 2578 scope.go:117] "RemoveContainer" containerID="7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0" Apr 24 21:29:23.185386 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:29:23.185368 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0\": container with ID starting with 7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0 not found: ID does not exist" containerID="7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0" Apr 24 21:29:23.185440 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185393 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0"} err="failed to get container status \"7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0\": rpc error: code = NotFound desc = could not find container \"7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0\": container with ID starting with 7af80f32440f598dc6906f8891017c17530b475471cb8d083c23b2b6867a91f0 not found: ID does not exist" Apr 24 21:29:23.185440 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185412 2578 scope.go:117] "RemoveContainer" containerID="72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0" Apr 24 21:29:23.185638 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:29:23.185617 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0\": container with ID starting with 72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0 not found: ID does not exist" 
containerID="72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0" Apr 24 21:29:23.185681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185645 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0"} err="failed to get container status \"72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0\": rpc error: code = NotFound desc = could not find container \"72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0\": container with ID starting with 72bd0b8ce3ab869e7f71fad60db917a159b10c87f77906983c2f600937802dd0 not found: ID does not exist" Apr 24 21:29:23.185681 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185661 2578 scope.go:117] "RemoveContainer" containerID="c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d" Apr 24 21:29:23.185910 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:29:23.185893 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d\": container with ID starting with c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d not found: ID does not exist" containerID="c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d" Apr 24 21:29:23.185955 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185917 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d"} err="failed to get container status \"c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d\": rpc error: code = NotFound desc = could not find container \"c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d\": container with ID starting with c21c21ff0723cae4591d9b4b69092bd475e267a8df4b953f204a908fa03ac32d not found: ID does not exist" Apr 24 
21:29:23.185955 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.185932 2578 scope.go:117] "RemoveContainer" containerID="6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6" Apr 24 21:29:23.186156 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:29:23.186139 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6\": container with ID starting with 6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6 not found: ID does not exist" containerID="6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6" Apr 24 21:29:23.186193 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:23.186163 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6"} err="failed to get container status \"6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6\": rpc error: code = NotFound desc = could not find container \"6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6\": container with ID starting with 6a300200d1e45dd2b7007af00a871bb4cc2229526d210c750f08f472630dc3a6 not found: ID does not exist" Apr 24 21:29:24.137659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:24.137633 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:29:24.138186 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:24.138157 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:29:24.138501 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:24.138477 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:24.659381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:24.659315 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" path="/var/lib/kubelet/pods/1d0f86f7-103c-47ac-81ba-2cafe28b7661/volumes" Apr 24 21:29:34.138125 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:34.138084 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:29:34.138595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:34.138436 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:44.138699 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:44.138664 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:29:44.139272 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:44.139244 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:54.138240 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:54.138192 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:29:54.138647 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:29:54.138483 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:04.138633 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:04.138591 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:30:04.139117 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:04.139096 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:14.138446 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:14.138402 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:30:14.138917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:14.138893 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 24 21:30:24.139365 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:24.139336 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:30:24.139848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:24.139468 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:30:37.872764 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:37.872728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-npgqt_1a8b3832-4b60-40a3-8ac3-4340624f9a3c/kserve-container/0.log" Apr 24 21:30:38.056920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.056877 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z"] Apr 24 21:30:38.057365 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.057332 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" containerID="cri-o://88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf" gracePeriod=30 Apr 24 21:30:38.057365 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.057344 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" containerID="cri-o://784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf" gracePeriod=30 Apr 24 21:30:38.057551 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.057376 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" 
containerName="kube-rbac-proxy" containerID="cri-o://656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423" gracePeriod=30 Apr 24 21:30:38.128390 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128329 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd"] Apr 24 21:30:38.128686 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128674 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" Apr 24 21:30:38.128726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128687 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" Apr 24 21:30:38.128726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128706 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" Apr 24 21:30:38.128726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128711 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" Apr 24 21:30:38.128726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128721 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" Apr 24 21:30:38.128726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128726 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" Apr 24 21:30:38.128890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128735 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="storage-initializer" Apr 24 21:30:38.128890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128741 2578 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="storage-initializer" Apr 24 21:30:38.128890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128828 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kube-rbac-proxy" Apr 24 21:30:38.128890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128838 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="agent" Apr 24 21:30:38.128890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.128847 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0f86f7-103c-47ac-81ba-2cafe28b7661" containerName="kserve-container" Apr 24 21:30:38.132176 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.132154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.134431 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.134406 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 24 21:30:38.134550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.134412 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 21:30:38.146438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.146418 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd"] Apr 24 21:30:38.200327 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.200302 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt"] Apr 24 21:30:38.200598 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.200560 2578 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kserve-container" containerID="cri-o://ba73df103d8dedf99d909526e1b575e966d3751ea47094f583eb8a1d601663b7" gracePeriod=30 Apr 24 21:30:38.200741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.200602 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kube-rbac-proxy" containerID="cri-o://3c8b07915beaaeb4b08e212e850f485abdf8cd93a290fc942701b610a6cef627" gracePeriod=30 Apr 24 21:30:38.226951 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.226923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2h8b\" (UniqueName: \"kubernetes.io/projected/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kube-api-access-b2h8b\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.227060 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.227036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.227101 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.227078 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: 
\"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.227161 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.227146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.327667 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.327639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.327817 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.327699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.327817 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.327736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2h8b\" (UniqueName: \"kubernetes.io/projected/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kube-api-access-b2h8b\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.327924 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:30:38.327849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.328129 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.328052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.328699 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.328675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.330308 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.330286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.335677 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.335650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2h8b\" (UniqueName: 
\"kubernetes.io/projected/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kube-api-access-b2h8b\") pod \"isvc-lightgbm-predictor-bdf964bd-c66zd\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.384348 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.384285 2578 generic.go:358] "Generic (PLEG): container finished" podID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerID="3c8b07915beaaeb4b08e212e850f485abdf8cd93a290fc942701b610a6cef627" exitCode=2 Apr 24 21:30:38.384348 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.384309 2578 generic.go:358] "Generic (PLEG): container finished" podID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerID="ba73df103d8dedf99d909526e1b575e966d3751ea47094f583eb8a1d601663b7" exitCode=2 Apr 24 21:30:38.384515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.384357 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" event={"ID":"1a8b3832-4b60-40a3-8ac3-4340624f9a3c","Type":"ContainerDied","Data":"3c8b07915beaaeb4b08e212e850f485abdf8cd93a290fc942701b610a6cef627"} Apr 24 21:30:38.384515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.384393 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" event={"ID":"1a8b3832-4b60-40a3-8ac3-4340624f9a3c","Type":"ContainerDied","Data":"ba73df103d8dedf99d909526e1b575e966d3751ea47094f583eb8a1d601663b7"} Apr 24 21:30:38.386544 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.386490 2578 generic.go:358] "Generic (PLEG): container finished" podID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerID="656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423" exitCode=2 Apr 24 21:30:38.386544 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.386538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" 
event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerDied","Data":"656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423"} Apr 24 21:30:38.435886 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.435867 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:30:38.443078 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.443058 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:38.530237 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.530205 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-message-dumper-kube-rbac-proxy-sar-config\") pod \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " Apr 24 21:30:38.530369 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.530249 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26f6v\" (UniqueName: \"kubernetes.io/projected/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-kube-api-access-26f6v\") pod \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " Apr 24 21:30:38.530429 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.530406 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls\") pod \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\" (UID: \"1a8b3832-4b60-40a3-8ac3-4340624f9a3c\") " Apr 24 21:30:38.530711 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.530670 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "1a8b3832-4b60-40a3-8ac3-4340624f9a3c" (UID: "1a8b3832-4b60-40a3-8ac3-4340624f9a3c"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:38.533206 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.533179 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-kube-api-access-26f6v" (OuterVolumeSpecName: "kube-api-access-26f6v") pod "1a8b3832-4b60-40a3-8ac3-4340624f9a3c" (UID: "1a8b3832-4b60-40a3-8ac3-4340624f9a3c"). InnerVolumeSpecName "kube-api-access-26f6v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:38.533274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.533240 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1a8b3832-4b60-40a3-8ac3-4340624f9a3c" (UID: "1a8b3832-4b60-40a3-8ac3-4340624f9a3c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:38.561683 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.561658 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd"] Apr 24 21:30:38.564231 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:30:38.564205 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f80a6c_f625_4cd5_a137_4fe7a6b8dca3.slice/crio-f4aa8f9c38f6f6d9d5f979cafa51a644ec9ad28af372d3056d604b6427c3df56 WatchSource:0}: Error finding container f4aa8f9c38f6f6d9d5f979cafa51a644ec9ad28af372d3056d604b6427c3df56: Status 404 returned error can't find the container with id f4aa8f9c38f6f6d9d5f979cafa51a644ec9ad28af372d3056d604b6427c3df56 Apr 24 21:30:38.631253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.631231 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:30:38.631253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.631254 2578 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:30:38.631369 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:38.631264 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26f6v\" (UniqueName: \"kubernetes.io/projected/1a8b3832-4b60-40a3-8ac3-4340624f9a3c-kube-api-access-26f6v\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:30:39.134135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.134092 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:30:39.391665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.391575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerStarted","Data":"d0e56d51c0abf4abcef44b0b099e8132dd00291249b98b343ec6b0c034e5df79"} Apr 24 21:30:39.391665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.391618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerStarted","Data":"f4aa8f9c38f6f6d9d5f979cafa51a644ec9ad28af372d3056d604b6427c3df56"} Apr 24 21:30:39.392878 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.392851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" event={"ID":"1a8b3832-4b60-40a3-8ac3-4340624f9a3c","Type":"ContainerDied","Data":"8ed93ed288b54ac867c1784b7250509ef9aac7e2f5cbfbfc8095ca693199eae2"} Apr 24 21:30:39.392994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.392892 2578 scope.go:117] "RemoveContainer" containerID="3c8b07915beaaeb4b08e212e850f485abdf8cd93a290fc942701b610a6cef627" Apr 24 21:30:39.392994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.392909 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt" Apr 24 21:30:39.400629 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.400611 2578 scope.go:117] "RemoveContainer" containerID="ba73df103d8dedf99d909526e1b575e966d3751ea47094f583eb8a1d601663b7" Apr 24 21:30:39.422488 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.422458 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt"] Apr 24 21:30:39.427567 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:39.427544 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-npgqt"] Apr 24 21:30:40.660000 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:40.659964 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" path="/var/lib/kubelet/pods/1a8b3832-4b60-40a3-8ac3-4340624f9a3c/volumes" Apr 24 21:30:42.408549 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:42.408519 2578 generic.go:358] "Generic (PLEG): container finished" podID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerID="88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf" exitCode=0 Apr 24 21:30:42.408906 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:42.408587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerDied","Data":"88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf"} Apr 24 21:30:43.413257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:43.413224 2578 generic.go:358] "Generic (PLEG): container finished" podID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerID="d0e56d51c0abf4abcef44b0b099e8132dd00291249b98b343ec6b0c034e5df79" exitCode=0 Apr 24 21:30:43.413617 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:43.413299 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerDied","Data":"d0e56d51c0abf4abcef44b0b099e8132dd00291249b98b343ec6b0c034e5df79"} Apr 24 21:30:44.134936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:44.134877 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:30:44.138555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:44.138262 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:30:44.139120 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:44.139092 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:49.134906 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:49.134861 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:30:49.135344 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:49.135012 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:30:50.441317 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:50.441284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerStarted","Data":"e4724f9b051b2e1e58552e5f94bfd71e5e264faf0a883faeb8effde7cc8a91b0"} Apr 24 21:30:50.441800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:50.441326 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerStarted","Data":"0019c636472aad7d8e37019f7646a49e10af6d2e4b85879bd64152b422d1d341"} Apr 24 21:30:50.441800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:50.441615 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:50.441800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:50.441640 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:50.442672 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:50.442643 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:30:50.473214 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:50.473164 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podStartSLOduration=6.322496921 podStartE2EDuration="12.473153328s" podCreationTimestamp="2026-04-24 21:30:38 +0000 UTC" firstStartedPulling="2026-04-24 21:30:43.414434377 +0000 UTC m=+867.382236417" lastFinishedPulling="2026-04-24 21:30:49.565090779 +0000 UTC m=+873.532892824" 
observedRunningTime="2026-04-24 21:30:50.472253354 +0000 UTC m=+874.440055416" watchObservedRunningTime="2026-04-24 21:30:50.473153328 +0000 UTC m=+874.440955389" Apr 24 21:30:51.444977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:51.444935 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:30:54.134628 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:54.134583 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:30:54.138902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:54.138875 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:30:54.139248 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:54.139229 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:56.448930 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:56.448860 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:30:56.449486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:56.449462 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:30:59.134696 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:30:59.134651 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:31:04.134525 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:04.134487 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:31:04.138900 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:04.138878 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:31:04.139005 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:04.138991 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:31:04.139307 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:04.139286 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Apr 24 21:31:04.139392 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:04.139380 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:31:06.449871 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:06.449835 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:31:08.227608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.227585 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:31:08.367361 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.367294 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kserve-provision-location\") pod \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " Apr 24 21:31:08.367361 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.367328 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs5t9\" (UniqueName: \"kubernetes.io/projected/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kube-api-access-qs5t9\") pod \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " Apr 24 21:31:08.367520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.367388 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-proxy-tls\") pod \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " Apr 24 21:31:08.367520 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:31:08.367433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-isvc-logger-kube-rbac-proxy-sar-config\") pod \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\" (UID: \"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5\") " Apr 24 21:31:08.367617 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.367580 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" (UID: "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:08.367860 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.367834 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" (UID: "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:08.369558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.369529 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" (UID: "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:08.369632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.369562 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kube-api-access-qs5t9" (OuterVolumeSpecName: "kube-api-access-qs5t9") pod "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" (UID: "94c95a50-9fb3-41c1-98c6-3ccd6570bcf5"). InnerVolumeSpecName "kube-api-access-qs5t9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:08.467916 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.467895 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:31:08.467916 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.467914 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:31:08.468031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.467924 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:31:08.468031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.467934 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qs5t9\" (UniqueName: \"kubernetes.io/projected/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5-kube-api-access-qs5t9\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:31:08.501655 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.501624 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerID="784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf" exitCode=137 Apr 24 21:31:08.501740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.501673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerDied","Data":"784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf"} Apr 24 21:31:08.501740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.501707 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" event={"ID":"94c95a50-9fb3-41c1-98c6-3ccd6570bcf5","Type":"ContainerDied","Data":"73c39459254e0352b0f820a6a7febc1dee6040361499940f86cb7c52ffedf065"} Apr 24 21:31:08.501740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.501708 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z" Apr 24 21:31:08.501740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.501720 2578 scope.go:117] "RemoveContainer" containerID="784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf" Apr 24 21:31:08.509623 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.509606 2578 scope.go:117] "RemoveContainer" containerID="656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423" Apr 24 21:31:08.516520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.516503 2578 scope.go:117] "RemoveContainer" containerID="88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf" Apr 24 21:31:08.523488 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.523469 2578 scope.go:117] "RemoveContainer" containerID="c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8" Apr 24 21:31:08.525618 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.525595 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z"] Apr 24 21:31:08.528891 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.528870 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-74744fb9f-pkn5z"] Apr 24 21:31:08.531262 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531245 2578 scope.go:117] "RemoveContainer" containerID="784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf" Apr 24 21:31:08.531490 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:08.531472 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf\": container with ID starting with 784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf not found: ID does not exist" containerID="784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf" Apr 24 21:31:08.531538 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531499 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf"} err="failed to get container status \"784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf\": rpc error: code = NotFound desc = could not find container \"784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf\": container with ID starting with 784576cd2cd46ae621c6a1519560e72073d3fc42b44d140bedb202235e496aaf not found: ID does not exist" Apr 24 21:31:08.531538 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531518 2578 scope.go:117] "RemoveContainer" containerID="656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423" Apr 24 21:31:08.531773 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:08.531740 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423\": container with ID starting with 656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423 not found: ID does not exist" containerID="656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423" Apr 24 21:31:08.531838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531776 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423"} err="failed to get container status \"656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423\": rpc error: code = NotFound desc = could not find container \"656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423\": container with ID starting with 656109df7d6de9bd8dea22293b21474594330161167081f7a0f2307c8f1dd423 not found: ID does not exist" Apr 24 21:31:08.531838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531789 2578 scope.go:117] "RemoveContainer" containerID="88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf" Apr 24 21:31:08.531964 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:08.531948 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf\": container with ID starting with 88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf not found: ID does not exist" containerID="88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf" Apr 24 21:31:08.531999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531970 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf"} err="failed to get container status \"88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf\": rpc error: code = NotFound desc = could not find container 
\"88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf\": container with ID starting with 88a3d88e647a2606a4d9ce3dafb05b2fd48a94a7290929cc79c1673bd90f8fdf not found: ID does not exist" Apr 24 21:31:08.531999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.531992 2578 scope.go:117] "RemoveContainer" containerID="c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8" Apr 24 21:31:08.532227 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:08.532207 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8\": container with ID starting with c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8 not found: ID does not exist" containerID="c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8" Apr 24 21:31:08.532281 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.532235 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8"} err="failed to get container status \"c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8\": rpc error: code = NotFound desc = could not find container \"c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8\": container with ID starting with c7c0981fdcfecf6745fb0b5f60c9751cd19150561c5b12ec30aa7ba2aa3499a8 not found: ID does not exist" Apr 24 21:31:08.660096 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:08.660071 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" path="/var/lib/kubelet/pods/94c95a50-9fb3-41c1-98c6-3ccd6570bcf5/volumes" Apr 24 21:31:16.450300 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:16.450258 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" 
podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:31:16.585911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:16.585884 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:31:16.586504 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:16.586480 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:31:26.450217 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:26.450175 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:31:36.449829 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:36.449787 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:31:46.449650 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:46.449614 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:31:56.449991 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:56.449945 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" 
podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:31:57.656887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:57.656860 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:31:58.193382 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.193350 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd"] Apr 24 21:31:58.281335 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281301 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6"] Apr 24 21:31:58.281625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281614 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kube-rbac-proxy" Apr 24 21:31:58.281667 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281627 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kube-rbac-proxy" Apr 24 21:31:58.281667 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281639 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" Apr 24 21:31:58.281667 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281645 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" Apr 24 21:31:58.281667 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281659 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" Apr 24 21:31:58.281667 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:31:58.281664 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281671 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kserve-container" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281676 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kserve-container" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281685 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="storage-initializer" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281690 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="storage-initializer" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281697 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281702 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281768 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="agent" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281779 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kube-rbac-proxy" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:31:58.281786 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a8b3832-4b60-40a3-8ac3-4340624f9a3c" containerName="kserve-container" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281793 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kserve-container" Apr 24 21:31:58.281837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.281802 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="94c95a50-9fb3-41c1-98c6-3ccd6570bcf5" containerName="kube-rbac-proxy" Apr 24 21:31:58.284980 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.284962 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.287352 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.287333 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:31:58.287436 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.287332 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 24 21:31:58.295559 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.295534 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6"] Apr 24 21:31:58.437283 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.437257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4a7438a-52d2-4721-b3a2-6b381804f88c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.437423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.437296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4a7438a-52d2-4721-b3a2-6b381804f88c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.437423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.437344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csmgm\" (UniqueName: \"kubernetes.io/projected/c4a7438a-52d2-4721-b3a2-6b381804f88c-kube-api-access-csmgm\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.437423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.437364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4a7438a-52d2-4721-b3a2-6b381804f88c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.538276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.538210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csmgm\" (UniqueName: \"kubernetes.io/projected/c4a7438a-52d2-4721-b3a2-6b381804f88c-kube-api-access-csmgm\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 
24 21:31:58.538276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.538242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4a7438a-52d2-4721-b3a2-6b381804f88c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.538479 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.538284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4a7438a-52d2-4721-b3a2-6b381804f88c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.538479 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.538309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4a7438a-52d2-4721-b3a2-6b381804f88c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.538610 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.538593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4a7438a-52d2-4721-b3a2-6b381804f88c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.539003 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.538980 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4a7438a-52d2-4721-b3a2-6b381804f88c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.540841 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.540819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4a7438a-52d2-4721-b3a2-6b381804f88c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.548359 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.548336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csmgm\" (UniqueName: \"kubernetes.io/projected/c4a7438a-52d2-4721-b3a2-6b381804f88c-kube-api-access-csmgm\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.596881 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.596856 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:31:58.721627 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:58.721596 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6"] Apr 24 21:31:58.724661 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:31:58.724626 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4a7438a_52d2_4721_b3a2_6b381804f88c.slice/crio-d8829d8cb65fec9e634dc871332239efa256cb9cb2199cc0c855947a2b43b5c5 WatchSource:0}: Error finding container d8829d8cb65fec9e634dc871332239efa256cb9cb2199cc0c855947a2b43b5c5: Status 404 returned error can't find the container with id d8829d8cb65fec9e634dc871332239efa256cb9cb2199cc0c855947a2b43b5c5 Apr 24 21:31:58.739996 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:58.739974 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:31:58.740074 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:58.740055 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls podName:a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:59.240033859 +0000 UTC m=+943.207835915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-c66zd" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:31:59.244221 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:59.244190 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:31:59.244374 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:31:59.244252 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls podName:a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:00.244238645 +0000 UTC m=+944.212040685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-c66zd" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:31:59.653281 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:59.653245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerStarted","Data":"5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db"} Apr 24 21:31:59.653436 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:59.653288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerStarted","Data":"d8829d8cb65fec9e634dc871332239efa256cb9cb2199cc0c855947a2b43b5c5"} Apr 24 21:31:59.653497 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:31:59.653425 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" containerID="cri-o://0019c636472aad7d8e37019f7646a49e10af6d2e4b85879bd64152b422d1d341" gracePeriod=30 Apr 24 21:31:59.653497 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:31:59.653463 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kube-rbac-proxy" containerID="cri-o://e4724f9b051b2e1e58552e5f94bfd71e5e264faf0a883faeb8effde7cc8a91b0" gracePeriod=30 Apr 24 21:32:00.252236 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:32:00.252202 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:32:00.252650 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:32:00.252277 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls podName:a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:02.252257789 +0000 UTC m=+946.220059830 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-c66zd" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:32:00.658862 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:00.658831 2578 generic.go:358] "Generic (PLEG): container finished" podID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerID="e4724f9b051b2e1e58552e5f94bfd71e5e264faf0a883faeb8effde7cc8a91b0" exitCode=2 Apr 24 21:32:00.659476 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:00.659453 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerDied","Data":"e4724f9b051b2e1e58552e5f94bfd71e5e264faf0a883faeb8effde7cc8a91b0"} Apr 24 21:32:01.445870 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:01.445831 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused" Apr 24 21:32:02.269026 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:32:02.268990 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:32:02.269192 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:32:02.269058 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls podName:a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:06.269044123 +0000 UTC m=+950.236846164 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-c66zd" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 24 21:32:02.666400 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:02.666366 2578 generic.go:358] "Generic (PLEG): container finished" podID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerID="5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db" exitCode=0 Apr 24 21:32:02.666765 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:02.666441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerDied","Data":"5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db"} Apr 24 21:32:03.671354 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.671327 2578 generic.go:358] "Generic (PLEG): container finished" podID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerID="0019c636472aad7d8e37019f7646a49e10af6d2e4b85879bd64152b422d1d341" exitCode=0 Apr 24 21:32:03.671666 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.671396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerDied","Data":"0019c636472aad7d8e37019f7646a49e10af6d2e4b85879bd64152b422d1d341"} Apr 24 21:32:03.673207 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.673184 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerStarted","Data":"6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b"} Apr 24 21:32:03.673311 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.673210 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerStarted","Data":"8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889"} Apr 24 21:32:03.673499 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.673481 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:32:03.673629 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.673613 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:32:03.674685 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.674661 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:03.692526 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:03.692479 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podStartSLOduration=5.692466917 podStartE2EDuration="5.692466917s" podCreationTimestamp="2026-04-24 21:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:32:03.690274485 +0000 UTC m=+947.658076546" watchObservedRunningTime="2026-04-24 21:32:03.692466917 +0000 UTC m=+947.660268979" Apr 24 21:32:04.089025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.089004 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:32:04.185251 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185200 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " Apr 24 21:32:04.185251 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185259 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kserve-provision-location\") pod \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " Apr 24 21:32:04.185487 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185291 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2h8b\" (UniqueName: \"kubernetes.io/projected/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kube-api-access-b2h8b\") pod \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " Apr 24 21:32:04.185487 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185386 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls\") pod \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\" (UID: \"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3\") " Apr 24 21:32:04.185578 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185546 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod 
"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:32:04.185647 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185629 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:32:04.185682 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.185634 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:32:04.187520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.187491 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kube-api-access-b2h8b" (OuterVolumeSpecName: "kube-api-access-b2h8b") pod "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3"). InnerVolumeSpecName "kube-api-access-b2h8b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:32:04.187520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.187497 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" (UID: "a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:32:04.286223 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.286131 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:32:04.286223 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.286173 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:32:04.286223 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.286188 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2h8b\" (UniqueName: \"kubernetes.io/projected/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3-kube-api-access-b2h8b\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:32:04.677163 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.677139 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" Apr 24 21:32:04.677482 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.677134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd" event={"ID":"a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3","Type":"ContainerDied","Data":"f4aa8f9c38f6f6d9d5f979cafa51a644ec9ad28af372d3056d604b6427c3df56"} Apr 24 21:32:04.677482 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.677272 2578 scope.go:117] "RemoveContainer" containerID="e4724f9b051b2e1e58552e5f94bfd71e5e264faf0a883faeb8effde7cc8a91b0" Apr 24 21:32:04.677708 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.677584 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:04.687711 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.687692 2578 scope.go:117] "RemoveContainer" containerID="0019c636472aad7d8e37019f7646a49e10af6d2e4b85879bd64152b422d1d341" Apr 24 21:32:04.695285 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.695267 2578 scope.go:117] "RemoveContainer" containerID="d0e56d51c0abf4abcef44b0b099e8132dd00291249b98b343ec6b0c034e5df79" Apr 24 21:32:04.695703 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.695682 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd"] Apr 24 21:32:04.699525 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:04.699503 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-c66zd"] Apr 24 21:32:06.660984 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:06.660954 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" path="/var/lib/kubelet/pods/a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3/volumes" Apr 24 21:32:09.681380 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:09.681355 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:32:09.681852 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:09.681828 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:19.681836 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:19.681796 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:29.682010 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:29.681931 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:39.681890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:39.681853 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:49.681962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:49.681924 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:32:59.682093 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:32:59.682051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:33:09.682418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:09.682382 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:33:12.660081 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:12.660054 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:33:19.048334 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.048298 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6"] Apr 24 21:33:19.048899 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.048610 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" containerID="cri-o://8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889" gracePeriod=30 Apr 24 21:33:19.048899 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.048684 2578 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kube-rbac-proxy" containerID="cri-o://6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b" gracePeriod=30 Apr 24 21:33:19.140686 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.140657 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq"] Apr 24 21:33:19.141047 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141033 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="storage-initializer" Apr 24 21:33:19.141093 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141050 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="storage-initializer" Apr 24 21:33:19.141093 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141061 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" Apr 24 21:33:19.141093 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141069 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" Apr 24 21:33:19.141093 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141089 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kube-rbac-proxy" Apr 24 21:33:19.141226 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141095 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kube-rbac-proxy" Apr 24 21:33:19.141226 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141159 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kube-rbac-proxy" Apr 24 21:33:19.141226 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.141169 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7f80a6c-f625-4cd5-a137-4fe7a6b8dca3" containerName="kserve-container" Apr 24 21:33:19.144579 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.144556 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.146815 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.146738 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:33:19.146929 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.146850 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 24 21:33:19.164562 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.164526 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq"] Apr 24 21:33:19.228465 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.228440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e860b44-c98e-448a-bd09-50c263cdec59-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.228596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.228481 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/1e860b44-c98e-448a-bd09-50c263cdec59-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.228596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.228507 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e860b44-c98e-448a-bd09-50c263cdec59-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.228705 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.228589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk82q\" (UniqueName: \"kubernetes.io/projected/1e860b44-c98e-448a-bd09-50c263cdec59-kube-api-access-hk82q\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.329067 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.329005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e860b44-c98e-448a-bd09-50c263cdec59-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.329067 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.329043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e860b44-c98e-448a-bd09-50c263cdec59-proxy-tls\") 
pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.329288 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.329075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk82q\" (UniqueName: \"kubernetes.io/projected/1e860b44-c98e-448a-bd09-50c263cdec59-kube-api-access-hk82q\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.329288 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.329131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e860b44-c98e-448a-bd09-50c263cdec59-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.329421 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.329387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e860b44-c98e-448a-bd09-50c263cdec59-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.329728 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.329708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1e860b44-c98e-448a-bd09-50c263cdec59-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.331551 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.331531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e860b44-c98e-448a-bd09-50c263cdec59-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.337023 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.336997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk82q\" (UniqueName: \"kubernetes.io/projected/1e860b44-c98e-448a-bd09-50c263cdec59-kube-api-access-hk82q\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.456781 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.456738 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:33:19.575412 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.575348 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq"] Apr 24 21:33:19.577605 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:33:19.577577 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e860b44_c98e_448a_bd09_50c263cdec59.slice/crio-bf184a8b31aadb6e40b85510974e88859339422eea2b28024633c6d67e37ebda WatchSource:0}: Error finding container bf184a8b31aadb6e40b85510974e88859339422eea2b28024633c6d67e37ebda: Status 404 returned error can't find the container with id bf184a8b31aadb6e40b85510974e88859339422eea2b28024633c6d67e37ebda Apr 24 21:33:19.579446 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.579394 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:33:19.677945 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.677915 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused" Apr 24 21:33:19.894498 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.894408 2578 generic.go:358] "Generic (PLEG): container finished" podID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerID="6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b" exitCode=2 Apr 24 21:33:19.894645 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.894494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" 
event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerDied","Data":"6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b"} Apr 24 21:33:19.895795 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.895768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerStarted","Data":"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d"} Apr 24 21:33:19.895931 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:19.895801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerStarted","Data":"bf184a8b31aadb6e40b85510974e88859339422eea2b28024633c6d67e37ebda"} Apr 24 21:33:22.656084 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:22.656049 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:33:23.283906 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.283885 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:33:23.361040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.361019 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4a7438a-52d2-4721-b3a2-6b381804f88c-proxy-tls\") pod \"c4a7438a-52d2-4721-b3a2-6b381804f88c\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " Apr 24 21:33:23.361161 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.361094 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csmgm\" (UniqueName: \"kubernetes.io/projected/c4a7438a-52d2-4721-b3a2-6b381804f88c-kube-api-access-csmgm\") pod \"c4a7438a-52d2-4721-b3a2-6b381804f88c\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " Apr 24 21:33:23.361161 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.361127 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4a7438a-52d2-4721-b3a2-6b381804f88c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"c4a7438a-52d2-4721-b3a2-6b381804f88c\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " Apr 24 21:33:23.361161 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.361154 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4a7438a-52d2-4721-b3a2-6b381804f88c-kserve-provision-location\") pod \"c4a7438a-52d2-4721-b3a2-6b381804f88c\" (UID: \"c4a7438a-52d2-4721-b3a2-6b381804f88c\") " Apr 24 21:33:23.361496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.361462 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a7438a-52d2-4721-b3a2-6b381804f88c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "c4a7438a-52d2-4721-b3a2-6b381804f88c" (UID: "c4a7438a-52d2-4721-b3a2-6b381804f88c"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:23.361609 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.361493 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a7438a-52d2-4721-b3a2-6b381804f88c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c4a7438a-52d2-4721-b3a2-6b381804f88c" (UID: "c4a7438a-52d2-4721-b3a2-6b381804f88c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:23.363121 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.363087 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a7438a-52d2-4721-b3a2-6b381804f88c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c4a7438a-52d2-4721-b3a2-6b381804f88c" (UID: "c4a7438a-52d2-4721-b3a2-6b381804f88c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:23.363260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.363241 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a7438a-52d2-4721-b3a2-6b381804f88c-kube-api-access-csmgm" (OuterVolumeSpecName: "kube-api-access-csmgm") pod "c4a7438a-52d2-4721-b3a2-6b381804f88c" (UID: "c4a7438a-52d2-4721-b3a2-6b381804f88c"). InnerVolumeSpecName "kube-api-access-csmgm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:23.462333 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.462296 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-csmgm\" (UniqueName: \"kubernetes.io/projected/c4a7438a-52d2-4721-b3a2-6b381804f88c-kube-api-access-csmgm\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:33:23.462333 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.462330 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4a7438a-52d2-4721-b3a2-6b381804f88c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:33:23.462500 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.462346 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4a7438a-52d2-4721-b3a2-6b381804f88c-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:33:23.462500 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.462361 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4a7438a-52d2-4721-b3a2-6b381804f88c-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:33:23.908496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.908460 2578 generic.go:358] "Generic (PLEG): container finished" podID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerID="8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889" exitCode=0 Apr 24 21:33:23.908934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.908543 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" Apr 24 21:33:23.908934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.908550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerDied","Data":"8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889"} Apr 24 21:33:23.908934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.908597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6" event={"ID":"c4a7438a-52d2-4721-b3a2-6b381804f88c","Type":"ContainerDied","Data":"d8829d8cb65fec9e634dc871332239efa256cb9cb2199cc0c855947a2b43b5c5"} Apr 24 21:33:23.908934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.908623 2578 scope.go:117] "RemoveContainer" containerID="6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b" Apr 24 21:33:23.909934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.909905 2578 generic.go:358] "Generic (PLEG): container finished" podID="1e860b44-c98e-448a-bd09-50c263cdec59" containerID="7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d" exitCode=0 Apr 24 21:33:23.910028 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.909965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerDied","Data":"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d"} Apr 24 21:33:23.917236 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.917110 2578 scope.go:117] "RemoveContainer" containerID="8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889" Apr 24 21:33:23.926360 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.926344 2578 scope.go:117] "RemoveContainer" 
containerID="5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db" Apr 24 21:33:23.933327 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.933308 2578 scope.go:117] "RemoveContainer" containerID="6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b" Apr 24 21:33:23.933633 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:33:23.933610 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b\": container with ID starting with 6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b not found: ID does not exist" containerID="6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b" Apr 24 21:33:23.933692 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.933645 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b"} err="failed to get container status \"6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b\": rpc error: code = NotFound desc = could not find container \"6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b\": container with ID starting with 6c6e70b466746a4f7e46c0e5dce1ed1a51434a7a08e417f5742b7a1d7e3ad71b not found: ID does not exist" Apr 24 21:33:23.933692 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.933666 2578 scope.go:117] "RemoveContainer" containerID="8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889" Apr 24 21:33:23.933946 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:33:23.933922 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889\": container with ID starting with 8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889 not found: ID does not exist" 
containerID="8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889" Apr 24 21:33:23.934045 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.933950 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889"} err="failed to get container status \"8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889\": rpc error: code = NotFound desc = could not find container \"8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889\": container with ID starting with 8582fd2efe850b05dc83f2d3b7d758ebdfecb1a9387a3a34ee86ad4544e70889 not found: ID does not exist" Apr 24 21:33:23.934045 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.933966 2578 scope.go:117] "RemoveContainer" containerID="5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db" Apr 24 21:33:23.934173 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:33:23.934157 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db\": container with ID starting with 5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db not found: ID does not exist" containerID="5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db" Apr 24 21:33:23.934216 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.934179 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db"} err="failed to get container status \"5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db\": rpc error: code = NotFound desc = could not find container \"5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db\": container with ID starting with 5a8f5078d5ede1c6dc8bae386c3c9edc93afd30b22f4c41208de641ad41ef1db not found: ID does not exist" Apr 24 
21:33:23.943306 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.943283 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6"] Apr 24 21:33:23.948647 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:23.948625 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-km8t6"] Apr 24 21:33:24.663850 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:33:24.663812 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" path="/var/lib/kubelet/pods/c4a7438a-52d2-4721-b3a2-6b381804f88c/volumes" Apr 24 21:35:34.349260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:35:34.349223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerStarted","Data":"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24"} Apr 24 21:35:34.349260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:35:34.349261 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerStarted","Data":"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc"} Apr 24 21:35:34.350015 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:35:34.349344 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:35:34.377074 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:35:34.377030 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" podStartSLOduration=5.5059557009999995 podStartE2EDuration="2m15.377018979s" podCreationTimestamp="2026-04-24 
21:33:19 +0000 UTC" firstStartedPulling="2026-04-24 21:33:23.911120515 +0000 UTC m=+1027.878922556" lastFinishedPulling="2026-04-24 21:35:33.78218379 +0000 UTC m=+1157.749985834" observedRunningTime="2026-04-24 21:35:34.375424964 +0000 UTC m=+1158.343227025" watchObservedRunningTime="2026-04-24 21:35:34.377018979 +0000 UTC m=+1158.344821041" Apr 24 21:35:35.352985 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:35:35.352948 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:35:41.361068 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:35:41.361037 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:36:11.364264 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:11.364230 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:36:16.608367 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:16.608337 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:36:16.610516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:16.610495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:36:19.407987 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.407954 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq"] Apr 24 21:36:19.408436 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.408383 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" 
podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kserve-container" containerID="cri-o://a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc" gracePeriod=30 Apr 24 21:36:19.408506 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.408448 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kube-rbac-proxy" containerID="cri-o://39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24" gracePeriod=30 Apr 24 21:36:19.446514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp"] Apr 24 21:36:19.446874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446860 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="storage-initializer" Apr 24 21:36:19.446874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446876 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="storage-initializer" Apr 24 21:36:19.446977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446895 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" Apr 24 21:36:19.446977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446900 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" Apr 24 21:36:19.446977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446913 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kube-rbac-proxy" Apr 24 21:36:19.446977 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:36:19.446920 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kube-rbac-proxy" Apr 24 21:36:19.446977 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446973 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kube-rbac-proxy" Apr 24 21:36:19.447124 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.446981 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4a7438a-52d2-4721-b3a2-6b381804f88c" containerName="kserve-container" Apr 24 21:36:19.450893 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.450870 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.452975 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.452951 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:36:19.453084 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.452993 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 24 21:36:19.458711 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.458651 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp"] Apr 24 21:36:19.486922 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.486886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbd8t\" (UniqueName: \"kubernetes.io/projected/3e2e15fa-9551-4bb0-ad39-859d391d652f-kube-api-access-wbd8t\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.487053 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.486994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e2e15fa-9551-4bb0-ad39-859d391d652f-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.487140 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.487064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e2e15fa-9551-4bb0-ad39-859d391d652f-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.487140 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.487102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e2e15fa-9551-4bb0-ad39-859d391d652f-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.588229 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.588179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e2e15fa-9551-4bb0-ad39-859d391d652f-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.588420 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.588237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e2e15fa-9551-4bb0-ad39-859d391d652f-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.588420 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.588268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e2e15fa-9551-4bb0-ad39-859d391d652f-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.588420 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.588328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbd8t\" (UniqueName: \"kubernetes.io/projected/3e2e15fa-9551-4bb0-ad39-859d391d652f-kube-api-access-wbd8t\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.588626 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.588603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e2e15fa-9551-4bb0-ad39-859d391d652f-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" 
Apr 24 21:36:19.588956 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.588934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e2e15fa-9551-4bb0-ad39-859d391d652f-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.591140 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.591113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e2e15fa-9551-4bb0-ad39-859d391d652f-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.596483 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.596456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbd8t\" (UniqueName: \"kubernetes.io/projected/3e2e15fa-9551-4bb0-ad39-859d391d652f-kube-api-access-wbd8t\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.763306 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.763226 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:19.881520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:19.881454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp"] Apr 24 21:36:19.884139 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:36:19.884110 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2e15fa_9551_4bb0_ad39_859d391d652f.slice/crio-51e44ba0fa78c6e3f355b28b97c1085fc56dafa0956435518ea27aeb4ec19709 WatchSource:0}: Error finding container 51e44ba0fa78c6e3f355b28b97c1085fc56dafa0956435518ea27aeb4ec19709: Status 404 returned error can't find the container with id 51e44ba0fa78c6e3f355b28b97c1085fc56dafa0956435518ea27aeb4ec19709 Apr 24 21:36:20.457225 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.457203 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:36:20.484513 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484483 2578 generic.go:358] "Generic (PLEG): container finished" podID="1e860b44-c98e-448a-bd09-50c263cdec59" containerID="39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24" exitCode=2 Apr 24 21:36:20.484513 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484512 2578 generic.go:358] "Generic (PLEG): container finished" podID="1e860b44-c98e-448a-bd09-50c263cdec59" containerID="a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc" exitCode=0 Apr 24 21:36:20.484741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484562 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" Apr 24 21:36:20.484741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484556 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerDied","Data":"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24"} Apr 24 21:36:20.484741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerDied","Data":"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc"} Apr 24 21:36:20.484741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484617 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq" event={"ID":"1e860b44-c98e-448a-bd09-50c263cdec59","Type":"ContainerDied","Data":"bf184a8b31aadb6e40b85510974e88859339422eea2b28024633c6d67e37ebda"} Apr 24 21:36:20.484741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.484637 2578 scope.go:117] "RemoveContainer" containerID="39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24" Apr 24 21:36:20.486491 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.486456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerStarted","Data":"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95"} Apr 24 21:36:20.486605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.486508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" 
event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerStarted","Data":"51e44ba0fa78c6e3f355b28b97c1085fc56dafa0956435518ea27aeb4ec19709"} Apr 24 21:36:20.493315 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.493294 2578 scope.go:117] "RemoveContainer" containerID="a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc" Apr 24 21:36:20.495484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.495463 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e860b44-c98e-448a-bd09-50c263cdec59-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"1e860b44-c98e-448a-bd09-50c263cdec59\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " Apr 24 21:36:20.495564 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.495508 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e860b44-c98e-448a-bd09-50c263cdec59-proxy-tls\") pod \"1e860b44-c98e-448a-bd09-50c263cdec59\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " Apr 24 21:36:20.495616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.495569 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e860b44-c98e-448a-bd09-50c263cdec59-kserve-provision-location\") pod \"1e860b44-c98e-448a-bd09-50c263cdec59\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " Apr 24 21:36:20.495616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.495605 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk82q\" (UniqueName: \"kubernetes.io/projected/1e860b44-c98e-448a-bd09-50c263cdec59-kube-api-access-hk82q\") pod \"1e860b44-c98e-448a-bd09-50c263cdec59\" (UID: \"1e860b44-c98e-448a-bd09-50c263cdec59\") " Apr 24 21:36:20.495883 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 21:36:20.495856 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e860b44-c98e-448a-bd09-50c263cdec59-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "1e860b44-c98e-448a-bd09-50c263cdec59" (UID: "1e860b44-c98e-448a-bd09-50c263cdec59"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:20.496041 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.496018 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e860b44-c98e-448a-bd09-50c263cdec59-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:20.496358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.496332 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e860b44-c98e-448a-bd09-50c263cdec59-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1e860b44-c98e-448a-bd09-50c263cdec59" (UID: "1e860b44-c98e-448a-bd09-50c263cdec59"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:36:20.498232 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.498202 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e860b44-c98e-448a-bd09-50c263cdec59-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1e860b44-c98e-448a-bd09-50c263cdec59" (UID: "1e860b44-c98e-448a-bd09-50c263cdec59"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:20.498372 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.498355 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e860b44-c98e-448a-bd09-50c263cdec59-kube-api-access-hk82q" (OuterVolumeSpecName: "kube-api-access-hk82q") pod "1e860b44-c98e-448a-bd09-50c263cdec59" (UID: "1e860b44-c98e-448a-bd09-50c263cdec59"). InnerVolumeSpecName "kube-api-access-hk82q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:20.503729 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.502774 2578 scope.go:117] "RemoveContainer" containerID="7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d" Apr 24 21:36:20.514257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514235 2578 scope.go:117] "RemoveContainer" containerID="39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24" Apr 24 21:36:20.514506 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:36:20.514485 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24\": container with ID starting with 39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24 not found: ID does not exist" containerID="39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24" Apr 24 21:36:20.514582 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514511 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24"} err="failed to get container status \"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24\": rpc error: code = NotFound desc = could not find container \"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24\": container with ID starting with 39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24 
not found: ID does not exist" Apr 24 21:36:20.514582 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514533 2578 scope.go:117] "RemoveContainer" containerID="a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc" Apr 24 21:36:20.514734 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:36:20.514717 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc\": container with ID starting with a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc not found: ID does not exist" containerID="a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc" Apr 24 21:36:20.514838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514737 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc"} err="failed to get container status \"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc\": rpc error: code = NotFound desc = could not find container \"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc\": container with ID starting with a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc not found: ID does not exist" Apr 24 21:36:20.514838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514771 2578 scope.go:117] "RemoveContainer" containerID="7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d" Apr 24 21:36:20.514952 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:36:20.514930 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d\": container with ID starting with 7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d not found: ID does not exist" 
containerID="7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d" Apr 24 21:36:20.515011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514949 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d"} err="failed to get container status \"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d\": rpc error: code = NotFound desc = could not find container \"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d\": container with ID starting with 7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d not found: ID does not exist" Apr 24 21:36:20.515011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.514963 2578 scope.go:117] "RemoveContainer" containerID="39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24" Apr 24 21:36:20.515203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.515181 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24"} err="failed to get container status \"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24\": rpc error: code = NotFound desc = could not find container \"39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24\": container with ID starting with 39eaccdb5cfd830b9bada93063e331c43a0a34d78487080247c20e7c7766cc24 not found: ID does not exist" Apr 24 21:36:20.515276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.515205 2578 scope.go:117] "RemoveContainer" containerID="a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc" Apr 24 21:36:20.515417 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.515395 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc"} err="failed to get container status 
\"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc\": rpc error: code = NotFound desc = could not find container \"a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc\": container with ID starting with a467ff9ca8326abf3a829d6926aa30897fd00511c00710792b95fc53808cd1fc not found: ID does not exist" Apr 24 21:36:20.515477 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.515417 2578 scope.go:117] "RemoveContainer" containerID="7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d" Apr 24 21:36:20.515603 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.515583 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d"} err="failed to get container status \"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d\": rpc error: code = NotFound desc = could not find container \"7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d\": container with ID starting with 7312cdbae04b0ed441ffe63f7f9da1dc05137443eb8a1ed187f031b5c620c62d not found: ID does not exist" Apr 24 21:36:20.597031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.597002 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e860b44-c98e-448a-bd09-50c263cdec59-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:20.597031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.597027 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e860b44-c98e-448a-bd09-50c263cdec59-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:20.597187 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.597040 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hk82q\" (UniqueName: 
\"kubernetes.io/projected/1e860b44-c98e-448a-bd09-50c263cdec59-kube-api-access-hk82q\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:20.800079 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.800012 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq"] Apr 24 21:36:20.803983 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:20.803960 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-49tbq"] Apr 24 21:36:22.660058 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:22.660024 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" path="/var/lib/kubelet/pods/1e860b44-c98e-448a-bd09-50c263cdec59/volumes" Apr 24 21:36:24.499696 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:24.499659 2578 generic.go:358] "Generic (PLEG): container finished" podID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerID="90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95" exitCode=0 Apr 24 21:36:24.500069 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:24.499741 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerDied","Data":"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95"} Apr 24 21:36:25.505114 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:25.505079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerStarted","Data":"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4"} Apr 24 21:36:25.505469 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:25.505120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerStarted","Data":"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d"} Apr 24 21:36:25.505469 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:25.505317 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:25.523967 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:25.523918 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" podStartSLOduration=6.523904057 podStartE2EDuration="6.523904057s" podCreationTimestamp="2026-04-24 21:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:25.521887299 +0000 UTC m=+1209.489689371" watchObservedRunningTime="2026-04-24 21:36:25.523904057 +0000 UTC m=+1209.491706119" Apr 24 21:36:26.508183 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:26.508148 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:26.509316 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:26.509288 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:36:27.511089 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:27.511051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:36:32.515290 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:32.515259 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:32.516484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:32.516460 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:39.499309 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.499271 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp"] Apr 24 21:36:39.501632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.499584 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kserve-container" containerID="cri-o://42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d" gracePeriod=30 Apr 24 21:36:39.501632 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.499630 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kube-rbac-proxy" containerID="cri-o://c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4" gracePeriod=30 Apr 24 21:36:39.567664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.567626 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98"] Apr 24 21:36:39.568040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568023 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kserve-container" Apr 24 21:36:39.568122 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568042 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kserve-container" Apr 24 21:36:39.568122 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568054 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kube-rbac-proxy" Apr 24 21:36:39.568122 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568061 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kube-rbac-proxy" Apr 24 21:36:39.568122 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568078 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="storage-initializer" Apr 24 21:36:39.568122 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568087 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="storage-initializer" Apr 24 21:36:39.568426 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568191 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kserve-container" Apr 24 21:36:39.568426 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.568209 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e860b44-c98e-448a-bd09-50c263cdec59" containerName="kube-rbac-proxy" Apr 24 21:36:39.571423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.571402 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.573524 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.573502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 24 21:36:39.573635 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.573585 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:36:39.579869 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.579841 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98"] Apr 24 21:36:39.652705 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.652675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b73bc36-5c64-4d18-a267-1f53c0128ff4-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.652853 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.652726 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zzs\" (UniqueName: \"kubernetes.io/projected/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kube-api-access-j4zzs\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.652895 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.652845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.652930 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.652891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b73bc36-5c64-4d18-a267-1f53c0128ff4-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.754255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.754162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b73bc36-5c64-4d18-a267-1f53c0128ff4-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.754255 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.754215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zzs\" (UniqueName: \"kubernetes.io/projected/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kube-api-access-j4zzs\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.754495 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.754276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.754495 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.754304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b73bc36-5c64-4d18-a267-1f53c0128ff4-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.754860 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.754832 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.755126 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.755105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b73bc36-5c64-4d18-a267-1f53c0128ff4-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.756961 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.756937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b73bc36-5c64-4d18-a267-1f53c0128ff4-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.762237 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.762211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zzs\" (UniqueName: \"kubernetes.io/projected/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kube-api-access-j4zzs\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:39.882314 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:39.882263 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:40.008266 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.008239 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98"] Apr 24 21:36:40.034887 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:36:40.034857 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b73bc36_5c64_4d18_a267_1f53c0128ff4.slice/crio-778a3682e7cff6d196dd2215910f215b7134eba60bd5fbb9bc885b3eb76674bc WatchSource:0}: Error finding container 778a3682e7cff6d196dd2215910f215b7134eba60bd5fbb9bc885b3eb76674bc: Status 404 returned error can't find the container with id 778a3682e7cff6d196dd2215910f215b7134eba60bd5fbb9bc885b3eb76674bc Apr 24 21:36:40.133071 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.133050 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:40.259527 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.259442 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbd8t\" (UniqueName: \"kubernetes.io/projected/3e2e15fa-9551-4bb0-ad39-859d391d652f-kube-api-access-wbd8t\") pod \"3e2e15fa-9551-4bb0-ad39-859d391d652f\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " Apr 24 21:36:40.259527 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.259486 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e2e15fa-9551-4bb0-ad39-859d391d652f-proxy-tls\") pod \"3e2e15fa-9551-4bb0-ad39-859d391d652f\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " Apr 24 21:36:40.259527 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.259522 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e2e15fa-9551-4bb0-ad39-859d391d652f-kserve-provision-location\") pod \"3e2e15fa-9551-4bb0-ad39-859d391d652f\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " Apr 24 21:36:40.259853 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.259710 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e2e15fa-9551-4bb0-ad39-859d391d652f-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"3e2e15fa-9551-4bb0-ad39-859d391d652f\" (UID: \"3e2e15fa-9551-4bb0-ad39-859d391d652f\") " Apr 24 21:36:40.259912 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.259855 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2e15fa-9551-4bb0-ad39-859d391d652f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"3e2e15fa-9551-4bb0-ad39-859d391d652f" (UID: "3e2e15fa-9551-4bb0-ad39-859d391d652f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:36:40.260040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.260013 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2e15fa-9551-4bb0-ad39-859d391d652f-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "3e2e15fa-9551-4bb0-ad39-859d391d652f" (UID: "3e2e15fa-9551-4bb0-ad39-859d391d652f"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:40.260123 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.260077 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e2e15fa-9551-4bb0-ad39-859d391d652f-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:40.261631 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.261609 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2e15fa-9551-4bb0-ad39-859d391d652f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3e2e15fa-9551-4bb0-ad39-859d391d652f" (UID: "3e2e15fa-9551-4bb0-ad39-859d391d652f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:40.261713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.261669 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2e15fa-9551-4bb0-ad39-859d391d652f-kube-api-access-wbd8t" (OuterVolumeSpecName: "kube-api-access-wbd8t") pod "3e2e15fa-9551-4bb0-ad39-859d391d652f" (UID: "3e2e15fa-9551-4bb0-ad39-859d391d652f"). InnerVolumeSpecName "kube-api-access-wbd8t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:40.360597 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.360570 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbd8t\" (UniqueName: \"kubernetes.io/projected/3e2e15fa-9551-4bb0-ad39-859d391d652f-kube-api-access-wbd8t\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:40.360597 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.360595 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e2e15fa-9551-4bb0-ad39-859d391d652f-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:40.360793 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.360606 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e2e15fa-9551-4bb0-ad39-859d391d652f-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:36:40.549539 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.549495 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerStarted","Data":"dbda3314fd83d4099bdce931d1904cf97ec8243d4f7db76883ea25f81e628a12"} Apr 24 21:36:40.549539 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.549545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerStarted","Data":"778a3682e7cff6d196dd2215910f215b7134eba60bd5fbb9bc885b3eb76674bc"} Apr 24 21:36:40.554164 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554131 2578 generic.go:358] "Generic (PLEG): container finished" podID="3e2e15fa-9551-4bb0-ad39-859d391d652f" 
containerID="c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4" exitCode=2 Apr 24 21:36:40.554164 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554160 2578 generic.go:358] "Generic (PLEG): container finished" podID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerID="42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d" exitCode=0 Apr 24 21:36:40.554368 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554212 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" Apr 24 21:36:40.554368 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerDied","Data":"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4"} Apr 24 21:36:40.554368 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerDied","Data":"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d"} Apr 24 21:36:40.554368 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp" event={"ID":"3e2e15fa-9551-4bb0-ad39-859d391d652f","Type":"ContainerDied","Data":"51e44ba0fa78c6e3f355b28b97c1085fc56dafa0956435518ea27aeb4ec19709"} Apr 24 21:36:40.554368 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.554279 2578 scope.go:117] "RemoveContainer" containerID="c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4" Apr 24 21:36:40.562770 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.562725 2578 scope.go:117] 
"RemoveContainer" containerID="42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d" Apr 24 21:36:40.571907 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.571890 2578 scope.go:117] "RemoveContainer" containerID="90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95" Apr 24 21:36:40.579175 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.579136 2578 scope.go:117] "RemoveContainer" containerID="c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4" Apr 24 21:36:40.579693 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:36:40.579664 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4\": container with ID starting with c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4 not found: ID does not exist" containerID="c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4" Apr 24 21:36:40.579838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.579705 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4"} err="failed to get container status \"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4\": rpc error: code = NotFound desc = could not find container \"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4\": container with ID starting with c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4 not found: ID does not exist" Apr 24 21:36:40.579838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.579735 2578 scope.go:117] "RemoveContainer" containerID="42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d" Apr 24 21:36:40.580168 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:36:40.580137 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d\": container with ID starting with 42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d not found: ID does not exist" containerID="42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d" Apr 24 21:36:40.580247 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.580178 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d"} err="failed to get container status \"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d\": rpc error: code = NotFound desc = could not find container \"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d\": container with ID starting with 42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d not found: ID does not exist" Apr 24 21:36:40.580247 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.580218 2578 scope.go:117] "RemoveContainer" containerID="90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95" Apr 24 21:36:40.580497 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:36:40.580475 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95\": container with ID starting with 90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95 not found: ID does not exist" containerID="90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95" Apr 24 21:36:40.580573 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.580506 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95"} err="failed to get container status \"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95\": rpc error: code = NotFound desc = could not find container 
\"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95\": container with ID starting with 90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95 not found: ID does not exist" Apr 24 21:36:40.580573 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.580530 2578 scope.go:117] "RemoveContainer" containerID="c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4" Apr 24 21:36:40.580902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.580866 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4"} err="failed to get container status \"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4\": rpc error: code = NotFound desc = could not find container \"c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4\": container with ID starting with c40e808b35842a096ff34627970f03a3fd004eb9457d95f939f74e40d0b6b3e4 not found: ID does not exist" Apr 24 21:36:40.580902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.580896 2578 scope.go:117] "RemoveContainer" containerID="42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d" Apr 24 21:36:40.581218 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.581197 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d"} err="failed to get container status \"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d\": rpc error: code = NotFound desc = could not find container \"42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d\": container with ID starting with 42d51cf5c4276d128f04db3bd8fa43e28df2b6a021f052b90c0a4cae0682a88d not found: ID does not exist" Apr 24 21:36:40.581295 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.581219 2578 scope.go:117] "RemoveContainer" 
containerID="90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95" Apr 24 21:36:40.581473 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.581446 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95"} err="failed to get container status \"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95\": rpc error: code = NotFound desc = could not find container \"90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95\": container with ID starting with 90eca81d3db30e0e00a3b4da52a654bd3aa35077393b8fa9962630a65a840f95 not found: ID does not exist" Apr 24 21:36:40.582035 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.582015 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp"] Apr 24 21:36:40.586761 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.586728 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-6c8zp"] Apr 24 21:36:40.663774 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:40.661823 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" path="/var/lib/kubelet/pods/3e2e15fa-9551-4bb0-ad39-859d391d652f/volumes" Apr 24 21:36:44.570870 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:44.570836 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerID="dbda3314fd83d4099bdce931d1904cf97ec8243d4f7db76883ea25f81e628a12" exitCode=0 Apr 24 21:36:44.571240 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:44.570911 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" 
event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerDied","Data":"dbda3314fd83d4099bdce931d1904cf97ec8243d4f7db76883ea25f81e628a12"} Apr 24 21:36:45.576076 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:45.576045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerStarted","Data":"fd16f6667254f73156ecd41f2191bda234e460761a162d7ac028b26135191016"} Apr 24 21:36:45.576076 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:45.576078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerStarted","Data":"a58254ff5526b0e3151970de6d80fceea2536839e4af114f2e3fb58aba68883d"} Apr 24 21:36:45.576465 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:45.576425 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:45.576465 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:45.576453 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:36:45.596935 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:45.596890 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" podStartSLOduration=6.596877891 podStartE2EDuration="6.596877891s" podCreationTimestamp="2026-04-24 21:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:45.595435894 +0000 UTC m=+1229.563237969" watchObservedRunningTime="2026-04-24 21:36:45.596877891 +0000 UTC m=+1229.564679953" Apr 24 21:36:51.586393 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:36:51.586368 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:37:21.590586 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:21.590552 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:37:29.600334 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.600296 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98"] Apr 24 21:37:29.600732 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.600637 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kserve-container" containerID="cri-o://a58254ff5526b0e3151970de6d80fceea2536839e4af114f2e3fb58aba68883d" gracePeriod=30 Apr 24 21:37:29.600732 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.600683 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kube-rbac-proxy" containerID="cri-o://fd16f6667254f73156ecd41f2191bda234e460761a162d7ac028b26135191016" gracePeriod=30 Apr 24 21:37:29.692302 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692258 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx"] Apr 24 21:37:29.692778 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692729 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kube-rbac-proxy" Apr 24 21:37:29.692778 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:37:29.692774 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kube-rbac-proxy" Apr 24 21:37:29.692962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692816 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="storage-initializer" Apr 24 21:37:29.692962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692826 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="storage-initializer" Apr 24 21:37:29.692962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692835 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kserve-container" Apr 24 21:37:29.692962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692843 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kserve-container" Apr 24 21:37:29.692962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692923 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kserve-container" Apr 24 21:37:29.692962 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.692938 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e2e15fa-9551-4bb0-ad39-859d391d652f" containerName="kube-rbac-proxy" Apr 24 21:37:29.697782 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.697761 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.702279 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.702259 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 24 21:37:29.702279 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.702269 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 24 21:37:29.710334 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.710304 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx"] Apr 24 21:37:29.727314 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.727290 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.727421 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.727334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.727421 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.727363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.727498 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.727434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l9gh\" (UniqueName: \"kubernetes.io/projected/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kube-api-access-6l9gh\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.828644 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.828604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l9gh\" (UniqueName: \"kubernetes.io/projected/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kube-api-access-6l9gh\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.828847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.828650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.828847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.828694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.828847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.828741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.829010 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:37:29.828893 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 24 21:37:29.829010 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:37:29.828971 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls podName:b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:30.328955113 +0000 UTC m=+1274.296757153 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls") pod "isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" (UID: "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 24 21:37:29.829207 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.829185 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.829489 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.829473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:29.841170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:29.841145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l9gh\" (UniqueName: \"kubernetes.io/projected/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kube-api-access-6l9gh\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:30.332518 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.332483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls\") pod 
\"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:30.335262 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.335232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-544466b4d7-b7qtx\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:30.609939 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.609911 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:30.716841 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.716812 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerID="fd16f6667254f73156ecd41f2191bda234e460761a162d7ac028b26135191016" exitCode=2 Apr 24 21:37:30.716841 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.716837 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerID="a58254ff5526b0e3151970de6d80fceea2536839e4af114f2e3fb58aba68883d" exitCode=0 Apr 24 21:37:30.717069 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.716927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerDied","Data":"fd16f6667254f73156ecd41f2191bda234e460761a162d7ac028b26135191016"} Apr 24 21:37:30.717069 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.716953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" 
event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerDied","Data":"a58254ff5526b0e3151970de6d80fceea2536839e4af114f2e3fb58aba68883d"} Apr 24 21:37:30.763850 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.763819 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx"] Apr 24 21:37:30.764433 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.764416 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:37:30.766061 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:37:30.766039 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f7ee0d_b7e5_46e8_871f_b473bc15b6a6.slice/crio-d68c45744a802a320a96219bcc36fca0aac37bf3305be58eeaaeeb7102854e64 WatchSource:0}: Error finding container d68c45744a802a320a96219bcc36fca0aac37bf3305be58eeaaeeb7102854e64: Status 404 returned error can't find the container with id d68c45744a802a320a96219bcc36fca0aac37bf3305be58eeaaeeb7102854e64 Apr 24 21:37:30.837609 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.837583 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4zzs\" (UniqueName: \"kubernetes.io/projected/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kube-api-access-j4zzs\") pod \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " Apr 24 21:37:30.837692 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.837629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b73bc36-5c64-4d18-a267-1f53c0128ff4-proxy-tls\") pod \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " Apr 24 21:37:30.837786 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.837690 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b73bc36-5c64-4d18-a267-1f53c0128ff4-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " Apr 24 21:37:30.837850 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.837793 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kserve-provision-location\") pod \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\" (UID: \"0b73bc36-5c64-4d18-a267-1f53c0128ff4\") " Apr 24 21:37:30.838119 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.838093 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b73bc36-5c64-4d18-a267-1f53c0128ff4-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "0b73bc36-5c64-4d18-a267-1f53c0128ff4" (UID: "0b73bc36-5c64-4d18-a267-1f53c0128ff4"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:30.838349 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.838251 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b73bc36-5c64-4d18-a267-1f53c0128ff4" (UID: "0b73bc36-5c64-4d18-a267-1f53c0128ff4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:30.839555 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.839536 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kube-api-access-j4zzs" (OuterVolumeSpecName: "kube-api-access-j4zzs") pod "0b73bc36-5c64-4d18-a267-1f53c0128ff4" (UID: "0b73bc36-5c64-4d18-a267-1f53c0128ff4"). InnerVolumeSpecName "kube-api-access-j4zzs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:30.839699 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.839679 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b73bc36-5c64-4d18-a267-1f53c0128ff4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b73bc36-5c64-4d18-a267-1f53c0128ff4" (UID: "0b73bc36-5c64-4d18-a267-1f53c0128ff4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:30.939372 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.939337 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b73bc36-5c64-4d18-a267-1f53c0128ff4-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:37:30.939534 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.939383 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:37:30.939534 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.939399 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4zzs\" (UniqueName: \"kubernetes.io/projected/0b73bc36-5c64-4d18-a267-1f53c0128ff4-kube-api-access-j4zzs\") on node 
\"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:37:30.939534 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:30.939413 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b73bc36-5c64-4d18-a267-1f53c0128ff4-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:37:31.721429 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.721396 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" Apr 24 21:37:31.721863 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.721399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98" event={"ID":"0b73bc36-5c64-4d18-a267-1f53c0128ff4","Type":"ContainerDied","Data":"778a3682e7cff6d196dd2215910f215b7134eba60bd5fbb9bc885b3eb76674bc"} Apr 24 21:37:31.721863 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.721528 2578 scope.go:117] "RemoveContainer" containerID="fd16f6667254f73156ecd41f2191bda234e460761a162d7ac028b26135191016" Apr 24 21:37:31.722877 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.722838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerStarted","Data":"1419323bed84db0ef11516099322a4ab0bff1337d9f7e6e49578d1d22561a396"} Apr 24 21:37:31.722877 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.722876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerStarted","Data":"d68c45744a802a320a96219bcc36fca0aac37bf3305be58eeaaeeb7102854e64"} Apr 24 21:37:31.730230 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.730214 2578 scope.go:117] "RemoveContainer" 
containerID="a58254ff5526b0e3151970de6d80fceea2536839e4af114f2e3fb58aba68883d" Apr 24 21:37:31.737913 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.737897 2578 scope.go:117] "RemoveContainer" containerID="dbda3314fd83d4099bdce931d1904cf97ec8243d4f7db76883ea25f81e628a12" Apr 24 21:37:31.799062 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.799030 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98"] Apr 24 21:37:31.808355 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:31.808333 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cbq98"] Apr 24 21:37:32.660008 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:32.659975 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" path="/var/lib/kubelet/pods/0b73bc36-5c64-4d18-a267-1f53c0128ff4/volumes" Apr 24 21:37:34.732899 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:34.732865 2578 generic.go:358] "Generic (PLEG): container finished" podID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerID="1419323bed84db0ef11516099322a4ab0bff1337d9f7e6e49578d1d22561a396" exitCode=0 Apr 24 21:37:34.733246 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:34.732942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerDied","Data":"1419323bed84db0ef11516099322a4ab0bff1337d9f7e6e49578d1d22561a396"} Apr 24 21:37:35.738194 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:35.738155 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerStarted","Data":"27c1d6ed062b1ea25304c61b6d204035afd55d2555f36d5a0366683431a3ebe5"} Apr 24 21:37:37.746464 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:37:37.746426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerStarted","Data":"90927ca0eda7a0dc53ad8a718c40718e2b7c0d8c17bdced3d37a2971059e02d1"} Apr 24 21:37:37.746915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:37.746467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerStarted","Data":"315cf334817f9c2fa065b88523b17dcd08b228f2611c32b7667dc7b3351d9035"} Apr 24 21:37:37.746915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:37.746602 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:37.746915 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:37.746733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:37.766453 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:37.766399 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podStartSLOduration=6.269867695 podStartE2EDuration="8.76638681s" podCreationTimestamp="2026-04-24 21:37:29 +0000 UTC" firstStartedPulling="2026-04-24 21:37:34.799150746 +0000 UTC m=+1278.766952786" lastFinishedPulling="2026-04-24 21:37:37.295669857 +0000 UTC m=+1281.263471901" observedRunningTime="2026-04-24 21:37:37.764335235 +0000 UTC m=+1281.732137299" watchObservedRunningTime="2026-04-24 21:37:37.76638681 +0000 UTC m=+1281.734188870" Apr 24 21:37:38.749874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:38.749844 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:37:44.758397 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:37:44.758369 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:38:04.760727 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:04.760698 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:38:44.761636 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:44.761557 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:38:49.788003 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.787971 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx"] Apr 24 21:38:49.788442 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.788291 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" containerID="cri-o://27c1d6ed062b1ea25304c61b6d204035afd55d2555f36d5a0366683431a3ebe5" gracePeriod=30 Apr 24 21:38:49.788442 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.788330 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" containerID="cri-o://90927ca0eda7a0dc53ad8a718c40718e2b7c0d8c17bdced3d37a2971059e02d1" gracePeriod=30 Apr 24 21:38:49.788442 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.788339 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" 
podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-agent" containerID="cri-o://315cf334817f9c2fa065b88523b17dcd08b228f2611c32b7667dc7b3351d9035" gracePeriod=30 Apr 24 21:38:49.855169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855137 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9"] Apr 24 21:38:49.855628 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855608 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="storage-initializer" Apr 24 21:38:49.855628 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855631 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="storage-initializer" Apr 24 21:38:49.855792 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855648 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kube-rbac-proxy" Apr 24 21:38:49.855792 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855658 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kube-rbac-proxy" Apr 24 21:38:49.855792 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855675 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kserve-container" Apr 24 21:38:49.855792 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855683 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kserve-container" Apr 24 21:38:49.855792 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.855776 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kserve-container" Apr 24 21:38:49.855792 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:38:49.855789 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b73bc36-5c64-4d18-a267-1f53c0128ff4" containerName="kube-rbac-proxy" Apr 24 21:38:49.858994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.858978 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:49.861167 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.861145 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 24 21:38:49.861299 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.861278 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 24 21:38:49.872239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.872215 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9"] Apr 24 21:38:49.945100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.945069 2578 generic.go:358] "Generic (PLEG): container finished" podID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerID="90927ca0eda7a0dc53ad8a718c40718e2b7c0d8c17bdced3d37a2971059e02d1" exitCode=2 Apr 24 21:38:49.945247 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:49.945147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerDied","Data":"90927ca0eda7a0dc53ad8a718c40718e2b7c0d8c17bdced3d37a2971059e02d1"} Apr 24 21:38:50.037472 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.037439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5mj\" (UniqueName: \"kubernetes.io/projected/9a0f8898-d553-43d5-82c8-bd3607358293-kube-api-access-nw5mj\") 
pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.037606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.037514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a0f8898-d553-43d5-82c8-bd3607358293-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.037606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.037599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0f8898-d553-43d5-82c8-bd3607358293-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.037679 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.037626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a0f8898-d553-43d5-82c8-bd3607358293-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.138153 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.138084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5mj\" (UniqueName: \"kubernetes.io/projected/9a0f8898-d553-43d5-82c8-bd3607358293-kube-api-access-nw5mj\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.138153 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.138126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a0f8898-d553-43d5-82c8-bd3607358293-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.138342 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.138165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0f8898-d553-43d5-82c8-bd3607358293-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.138342 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.138286 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a0f8898-d553-43d5-82c8-bd3607358293-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.138549 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.138529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0f8898-d553-43d5-82c8-bd3607358293-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.138838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.138818 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a0f8898-d553-43d5-82c8-bd3607358293-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.140957 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.140936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a0f8898-d553-43d5-82c8-bd3607358293-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.147057 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.147034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5mj\" (UniqueName: \"kubernetes.io/projected/9a0f8898-d553-43d5-82c8-bd3607358293-kube-api-access-nw5mj\") pod \"isvc-paddle-predictor-6b8b7cfb4b-n6ht9\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.168695 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.168671 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:38:50.291206 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.291179 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9"] Apr 24 21:38:50.292996 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:38:50.292970 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0f8898_d553_43d5_82c8_bd3607358293.slice/crio-75cdb038ee6e25ef875bd2cde3a5b11525860b127482d33617a798eea9e90bcd WatchSource:0}: Error finding container 75cdb038ee6e25ef875bd2cde3a5b11525860b127482d33617a798eea9e90bcd: Status 404 returned error can't find the container with id 75cdb038ee6e25ef875bd2cde3a5b11525860b127482d33617a798eea9e90bcd Apr 24 21:38:50.294738 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.294722 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:38:50.949664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.949624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerStarted","Data":"fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522"} Apr 24 21:38:50.949664 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:50.949666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerStarted","Data":"75cdb038ee6e25ef875bd2cde3a5b11525860b127482d33617a798eea9e90bcd"} Apr 24 21:38:51.955220 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:51.955187 2578 generic.go:358] "Generic (PLEG): container finished" podID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" 
containerID="27c1d6ed062b1ea25304c61b6d204035afd55d2555f36d5a0366683431a3ebe5" exitCode=0 Apr 24 21:38:51.955591 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:51.955266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerDied","Data":"27c1d6ed062b1ea25304c61b6d204035afd55d2555f36d5a0366683431a3ebe5"} Apr 24 21:38:54.753687 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:54.753655 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 21:38:54.759010 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:54.758987 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 21:38:54.964666 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:54.964590 2578 generic.go:358] "Generic (PLEG): container finished" podID="9a0f8898-d553-43d5-82c8-bd3607358293" containerID="fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522" exitCode=0 Apr 24 21:38:54.964826 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:54.964666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerDied","Data":"fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522"} Apr 24 21:38:59.754136 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:38:59.754083 
2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 21:39:04.753488 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:04.753444 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 21:39:04.753916 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:04.753593 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:39:04.759051 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:04.759023 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 21:39:06.003031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:06.002934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerStarted","Data":"a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0"} Apr 24 21:39:06.003031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:06.002993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" 
event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerStarted","Data":"22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d"} Apr 24 21:39:06.003386 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:06.003245 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:39:06.022401 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:06.021950 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podStartSLOduration=6.250311445 podStartE2EDuration="17.021937966s" podCreationTimestamp="2026-04-24 21:38:49 +0000 UTC" firstStartedPulling="2026-04-24 21:38:54.965848028 +0000 UTC m=+1358.933650071" lastFinishedPulling="2026-04-24 21:39:05.737474542 +0000 UTC m=+1369.705276592" observedRunningTime="2026-04-24 21:39:06.020846418 +0000 UTC m=+1369.988648479" watchObservedRunningTime="2026-04-24 21:39:06.021937966 +0000 UTC m=+1369.989740028" Apr 24 21:39:07.006206 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:07.006177 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:39:07.007325 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:07.007297 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:39:08.009483 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:08.009441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection 
refused" Apr 24 21:39:09.754466 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:09.754425 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 21:39:13.014244 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:13.014180 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:39:13.014795 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:13.014740 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:39:14.753697 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:14.753660 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 21:39:14.759074 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:14.759048 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 21:39:14.759167 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:14.759154 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:39:19.754174 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:19.754130 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 21:39:20.048669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.048575 2578 generic.go:358] "Generic (PLEG): container finished" podID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerID="315cf334817f9c2fa065b88523b17dcd08b228f2611c32b7667dc7b3351d9035" exitCode=137 Apr 24 21:39:20.048669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.048654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerDied","Data":"315cf334817f9c2fa065b88523b17dcd08b228f2611c32b7667dc7b3351d9035"} Apr 24 21:39:20.073990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.073970 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:39:20.189932 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.189904 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kserve-provision-location\") pod \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " Apr 24 21:39:20.190095 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.189976 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l9gh\" (UniqueName: \"kubernetes.io/projected/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kube-api-access-6l9gh\") pod \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " Apr 24 21:39:20.190095 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.190040 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " Apr 24 21:39:20.190215 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.190103 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls\") pod \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\" (UID: \"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6\") " Apr 24 21:39:20.190283 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.190254 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" (UID: "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:39:20.190338 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.190325 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:39:20.190406 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.190384 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" (UID: "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:20.192170 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.192148 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" (UID: "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:20.192272 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.192257 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kube-api-access-6l9gh" (OuterVolumeSpecName: "kube-api-access-6l9gh") pod "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" (UID: "b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6"). InnerVolumeSpecName "kube-api-access-6l9gh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:20.291378 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.291350 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:39:20.291378 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.291375 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:39:20.291378 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:20.291385 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6l9gh\" (UniqueName: \"kubernetes.io/projected/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6-kube-api-access-6l9gh\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:39:21.053445 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.053368 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" Apr 24 21:39:21.053831 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.053365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx" event={"ID":"b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6","Type":"ContainerDied","Data":"d68c45744a802a320a96219bcc36fca0aac37bf3305be58eeaaeeb7102854e64"} Apr 24 21:39:21.053831 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.053491 2578 scope.go:117] "RemoveContainer" containerID="90927ca0eda7a0dc53ad8a718c40718e2b7c0d8c17bdced3d37a2971059e02d1" Apr 24 21:39:21.061310 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.061288 2578 scope.go:117] "RemoveContainer" containerID="315cf334817f9c2fa065b88523b17dcd08b228f2611c32b7667dc7b3351d9035" Apr 24 21:39:21.068247 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.068229 2578 scope.go:117] "RemoveContainer" containerID="27c1d6ed062b1ea25304c61b6d204035afd55d2555f36d5a0366683431a3ebe5" Apr 24 21:39:21.073963 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.073940 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx"] Apr 24 21:39:21.075470 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.075449 2578 scope.go:117] "RemoveContainer" containerID="1419323bed84db0ef11516099322a4ab0bff1337d9f7e6e49578d1d22561a396" Apr 24 21:39:21.078241 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:21.078218 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-544466b4d7-b7qtx"] Apr 24 21:39:22.659906 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:22.659873 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" path="/var/lib/kubelet/pods/b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6/volumes" Apr 24 21:39:23.014731 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:39:23.014645 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:39:33.015618 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:33.015573 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:39:43.014933 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:43.014893 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:39:53.014903 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:39:53.014869 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:40:01.262931 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.262846 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9"] Apr 24 21:40:01.264046 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.264013 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" containerID="cri-o://22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d" gracePeriod=30 Apr 24 21:40:01.264556 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.264531 2578 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kube-rbac-proxy" containerID="cri-o://a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0" gracePeriod=30 Apr 24 21:40:01.336937 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.336908 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw"] Apr 24 21:40:01.337346 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337328 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-agent" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337347 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-agent" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337377 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337385 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337398 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="storage-initializer" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337407 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="storage-initializer" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337417 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" Apr 24 21:40:01.337462 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337424 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" Apr 24 21:40:01.337839 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337537 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kube-rbac-proxy" Apr 24 21:40:01.337839 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337554 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-agent" Apr 24 21:40:01.337839 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.337565 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4f7ee0d-b7e5-46e8-871f-b473bc15b6a6" containerName="kserve-container" Apr 24 21:40:01.341071 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.341054 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.343292 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.343273 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:40:01.343388 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.343371 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 24 21:40:01.350360 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.350341 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw"] Apr 24 21:40:01.394110 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.394083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5802b461-0b4c-44b4-8e35-9ec2476f7962-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.394214 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.394115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.394214 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.394133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5802b461-0b4c-44b4-8e35-9ec2476f7962-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.394327 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.394304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r967h\" (UniqueName: \"kubernetes.io/projected/5802b461-0b4c-44b4-8e35-9ec2476f7962-kube-api-access-r967h\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.495552 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.495519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5802b461-0b4c-44b4-8e35-9ec2476f7962-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.495552 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.495556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.495803 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.495576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5802b461-0b4c-44b4-8e35-9ec2476f7962-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.495803 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.495641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r967h\" (UniqueName: \"kubernetes.io/projected/5802b461-0b4c-44b4-8e35-9ec2476f7962-kube-api-access-r967h\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.495803 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:40:01.495708 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-runtime-predictor-serving-cert: secret "isvc-paddle-runtime-predictor-serving-cert" not found Apr 24 21:40:01.495972 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:40:01.495810 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls podName:5802b461-0b4c-44b4-8e35-9ec2476f7962 nodeName:}" failed. No retries permitted until 2026-04-24 21:40:01.995785468 +0000 UTC m=+1425.963587526 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls") pod "isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" (UID: "5802b461-0b4c-44b4-8e35-9ec2476f7962") : secret "isvc-paddle-runtime-predictor-serving-cert" not found Apr 24 21:40:01.495972 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.495942 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5802b461-0b4c-44b4-8e35-9ec2476f7962-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.496258 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.496237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5802b461-0b4c-44b4-8e35-9ec2476f7962-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:01.504836 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:01.504813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r967h\" (UniqueName: \"kubernetes.io/projected/5802b461-0b4c-44b4-8e35-9ec2476f7962-kube-api-access-r967h\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:02.000966 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:02.000923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls\") 
pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:02.003493 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:02.003471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:02.179010 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:02.178969 2578 generic.go:358] "Generic (PLEG): container finished" podID="9a0f8898-d553-43d5-82c8-bd3607358293" containerID="a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0" exitCode=2 Apr 24 21:40:02.179174 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:02.179041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerDied","Data":"a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0"} Apr 24 21:40:02.251565 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:02.251463 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:02.386736 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:02.386562 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw"] Apr 24 21:40:02.389301 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:40:02.389274 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5802b461_0b4c_44b4_8e35_9ec2476f7962.slice/crio-558364ebbc84af8d378563f35e7725ae4e41c12aa83b14867bd97b08483dd1b9 WatchSource:0}: Error finding container 558364ebbc84af8d378563f35e7725ae4e41c12aa83b14867bd97b08483dd1b9: Status 404 returned error can't find the container with id 558364ebbc84af8d378563f35e7725ae4e41c12aa83b14867bd97b08483dd1b9 Apr 24 21:40:03.009898 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.009847 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused" Apr 24 21:40:03.015275 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.015248 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:40:03.183158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.183126 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" 
event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerStarted","Data":"9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2"} Apr 24 21:40:03.183158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.183165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerStarted","Data":"558364ebbc84af8d378563f35e7725ae4e41c12aa83b14867bd97b08483dd1b9"} Apr 24 21:40:03.906906 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.906880 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:40:03.917515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.917495 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a0f8898-d553-43d5-82c8-bd3607358293-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"9a0f8898-d553-43d5-82c8-bd3607358293\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " Apr 24 21:40:03.917604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.917545 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0f8898-d553-43d5-82c8-bd3607358293-kserve-provision-location\") pod \"9a0f8898-d553-43d5-82c8-bd3607358293\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " Apr 24 21:40:03.917604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.917573 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a0f8898-d553-43d5-82c8-bd3607358293-proxy-tls\") pod \"9a0f8898-d553-43d5-82c8-bd3607358293\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " Apr 24 21:40:03.917604 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.917593 
2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5mj\" (UniqueName: \"kubernetes.io/projected/9a0f8898-d553-43d5-82c8-bd3607358293-kube-api-access-nw5mj\") pod \"9a0f8898-d553-43d5-82c8-bd3607358293\" (UID: \"9a0f8898-d553-43d5-82c8-bd3607358293\") " Apr 24 21:40:03.917836 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.917815 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0f8898-d553-43d5-82c8-bd3607358293-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "9a0f8898-d553-43d5-82c8-bd3607358293" (UID: "9a0f8898-d553-43d5-82c8-bd3607358293"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:40:03.919713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.919683 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0f8898-d553-43d5-82c8-bd3607358293-kube-api-access-nw5mj" (OuterVolumeSpecName: "kube-api-access-nw5mj") pod "9a0f8898-d553-43d5-82c8-bd3607358293" (UID: "9a0f8898-d553-43d5-82c8-bd3607358293"). InnerVolumeSpecName "kube-api-access-nw5mj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:03.919819 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.919720 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0f8898-d553-43d5-82c8-bd3607358293-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a0f8898-d553-43d5-82c8-bd3607358293" (UID: "9a0f8898-d553-43d5-82c8-bd3607358293"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:03.927262 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:03.927228 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0f8898-d553-43d5-82c8-bd3607358293-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9a0f8898-d553-43d5-82c8-bd3607358293" (UID: "9a0f8898-d553-43d5-82c8-bd3607358293"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:04.018209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.018129 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0f8898-d553-43d5-82c8-bd3607358293-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:40:04.018209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.018156 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a0f8898-d553-43d5-82c8-bd3607358293-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:40:04.018209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.018166 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nw5mj\" (UniqueName: \"kubernetes.io/projected/9a0f8898-d553-43d5-82c8-bd3607358293-kube-api-access-nw5mj\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:40:04.018209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.018179 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a0f8898-d553-43d5-82c8-bd3607358293-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:40:04.188149 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.188119 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="9a0f8898-d553-43d5-82c8-bd3607358293" containerID="22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d" exitCode=0 Apr 24 21:40:04.188305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.188203 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" Apr 24 21:40:04.188305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.188205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerDied","Data":"22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d"} Apr 24 21:40:04.188305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.188249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9" event={"ID":"9a0f8898-d553-43d5-82c8-bd3607358293","Type":"ContainerDied","Data":"75cdb038ee6e25ef875bd2cde3a5b11525860b127482d33617a798eea9e90bcd"} Apr 24 21:40:04.188305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.188268 2578 scope.go:117] "RemoveContainer" containerID="a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0" Apr 24 21:40:04.196410 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.196392 2578 scope.go:117] "RemoveContainer" containerID="22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d" Apr 24 21:40:04.203516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.203500 2578 scope.go:117] "RemoveContainer" containerID="fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522" Apr 24 21:40:04.210613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.210597 2578 scope.go:117] "RemoveContainer" containerID="a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0" Apr 24 21:40:04.210952 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:40:04.210933 2578 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0\": container with ID starting with a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0 not found: ID does not exist" containerID="a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0" Apr 24 21:40:04.211031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.210960 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0"} err="failed to get container status \"a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0\": rpc error: code = NotFound desc = could not find container \"a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0\": container with ID starting with a36f8caeef87916cc3b8fa206a462ad0bc5f6070e76cbd39d775accd9f89c6b0 not found: ID does not exist" Apr 24 21:40:04.211031 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.210977 2578 scope.go:117] "RemoveContainer" containerID="22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d" Apr 24 21:40:04.211194 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:40:04.211172 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d\": container with ID starting with 22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d not found: ID does not exist" containerID="22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d" Apr 24 21:40:04.211242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.211202 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d"} err="failed to get container status 
\"22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d\": rpc error: code = NotFound desc = could not find container \"22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d\": container with ID starting with 22b76f04cc26f58a31ab66a969d541a27b005e5c277da4d7dd21379919db3d6d not found: ID does not exist" Apr 24 21:40:04.211242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.211219 2578 scope.go:117] "RemoveContainer" containerID="fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522" Apr 24 21:40:04.211451 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:40:04.211432 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522\": container with ID starting with fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522 not found: ID does not exist" containerID="fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522" Apr 24 21:40:04.211510 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.211459 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522"} err="failed to get container status \"fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522\": rpc error: code = NotFound desc = could not find container \"fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522\": container with ID starting with fce3d7fc1ad11a3f4721d3d008b62522f1cdf3bcffae1c5b75accb6ae7f3d522 not found: ID does not exist" Apr 24 21:40:04.211999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.211981 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9"] Apr 24 21:40:04.215163 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.215143 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-n6ht9"] Apr 24 21:40:04.660299 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:04.660268 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" path="/var/lib/kubelet/pods/9a0f8898-d553-43d5-82c8-bd3607358293/volumes" Apr 24 21:40:07.200400 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:07.200365 2578 generic.go:358] "Generic (PLEG): container finished" podID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerID="9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2" exitCode=0 Apr 24 21:40:07.200852 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:07.200421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerDied","Data":"9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2"} Apr 24 21:40:08.205499 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:08.205466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerStarted","Data":"6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b"} Apr 24 21:40:08.205499 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:08.205505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerStarted","Data":"48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6"} Apr 24 21:40:08.205910 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:08.205692 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:08.232341 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:40:08.232297 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podStartSLOduration=7.232284477 podStartE2EDuration="7.232284477s" podCreationTimestamp="2026-04-24 21:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:08.229406189 +0000 UTC m=+1432.197208251" watchObservedRunningTime="2026-04-24 21:40:08.232284477 +0000 UTC m=+1432.200086539" Apr 24 21:40:09.209004 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:09.208971 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:09.210158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:09.210131 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:40:10.211997 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:10.211961 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:40:15.216266 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:15.216238 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:40:15.216849 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:15.216817 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" 
podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:40:25.217795 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:25.217729 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:40:35.217418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:35.217374 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:40:45.216804 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:45.216743 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:40:55.217496 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:40:55.217467 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:41:02.692800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.692767 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw"] Apr 24 21:41:02.693249 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.693201 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" 
podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" containerID="cri-o://48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6" gracePeriod=30 Apr 24 21:41:02.693319 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.693205 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kube-rbac-proxy" containerID="cri-o://6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b" gracePeriod=30 Apr 24 21:41:02.777638 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.777606 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp"] Apr 24 21:41:02.777970 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.777957 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kube-rbac-proxy" Apr 24 21:41:02.778019 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.777973 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kube-rbac-proxy" Apr 24 21:41:02.778019 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.777989 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="storage-initializer" Apr 24 21:41:02.778019 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.777994 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="storage-initializer" Apr 24 21:41:02.778019 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.778007 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" Apr 24 21:41:02.778019 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:41:02.778012 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" Apr 24 21:41:02.778169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.778066 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kube-rbac-proxy" Apr 24 21:41:02.778169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.778075 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a0f8898-d553-43d5-82c8-bd3607358293" containerName="kserve-container" Apr 24 21:41:02.781358 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.781340 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:02.783705 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.783685 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:41:02.783847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.783830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 24 21:41:02.789625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.789600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp"] Apr 24 21:41:02.958780 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.958682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4da6e542-bfed-4eab-80b4-83d5baba8bb9-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") 
" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:02.958780 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.958741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:02.958955 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.958783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:02.958955 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:02.958826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj2s4\" (UniqueName: \"kubernetes.io/projected/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kube-api-access-pj2s4\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.060045 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.060015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4da6e542-bfed-4eab-80b4-83d5baba8bb9-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.060175 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.060067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.060240 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.060183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.060294 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.060231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj2s4\" (UniqueName: \"kubernetes.io/projected/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kube-api-access-pj2s4\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.060362 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:41:03.060338 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-serving-cert: secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 24 21:41:03.060424 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.060406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.060484 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:41:03.060438 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls podName:4da6e542-bfed-4eab-80b4-83d5baba8bb9 nodeName:}" failed. No retries permitted until 2026-04-24 21:41:03.560415253 +0000 UTC m=+1487.528217295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls") pod "isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" (UID: "4da6e542-bfed-4eab-80b4-83d5baba8bb9") : secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 24 21:41:03.060640 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.060620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4da6e542-bfed-4eab-80b4-83d5baba8bb9-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.079426 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.079400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj2s4\" (UniqueName: \"kubernetes.io/projected/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kube-api-access-pj2s4\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.371522 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:41:03.371438 2578 generic.go:358] "Generic (PLEG): container finished" podID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerID="6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b" exitCode=2 Apr 24 21:41:03.371522 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.371506 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerDied","Data":"6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b"} Apr 24 21:41:03.564974 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.564942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.567484 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.567451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.692195 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.692170 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:03.815248 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:03.815220 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp"] Apr 24 21:41:03.818027 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:41:03.818000 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4da6e542_bfed_4eab_80b4_83d5baba8bb9.slice/crio-f0957c355dbe3bbc418853ebc2740ab4e9f7d1c847cdfdc62c2aedc12c0e4efc WatchSource:0}: Error finding container f0957c355dbe3bbc418853ebc2740ab4e9f7d1c847cdfdc62c2aedc12c0e4efc: Status 404 returned error can't find the container with id f0957c355dbe3bbc418853ebc2740ab4e9f7d1c847cdfdc62c2aedc12c0e4efc Apr 24 21:41:04.375341 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:04.375303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerStarted","Data":"a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed"} Apr 24 21:41:04.375506 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:04.375347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerStarted","Data":"f0957c355dbe3bbc418853ebc2740ab4e9f7d1c847cdfdc62c2aedc12c0e4efc"} Apr 24 21:41:05.233058 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.233035 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:41:05.379233 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.379172 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r967h\" (UniqueName: \"kubernetes.io/projected/5802b461-0b4c-44b4-8e35-9ec2476f7962-kube-api-access-r967h\") pod \"5802b461-0b4c-44b4-8e35-9ec2476f7962\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " Apr 24 21:41:05.379233 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.379220 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5802b461-0b4c-44b4-8e35-9ec2476f7962-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"5802b461-0b4c-44b4-8e35-9ec2476f7962\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " Apr 24 21:41:05.379412 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.379275 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls\") pod \"5802b461-0b4c-44b4-8e35-9ec2476f7962\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " Apr 24 21:41:05.379412 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.379368 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5802b461-0b4c-44b4-8e35-9ec2476f7962-kserve-provision-location\") pod \"5802b461-0b4c-44b4-8e35-9ec2476f7962\" (UID: \"5802b461-0b4c-44b4-8e35-9ec2476f7962\") " Apr 24 21:41:05.379704 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.379669 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5802b461-0b4c-44b4-8e35-9ec2476f7962-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "5802b461-0b4c-44b4-8e35-9ec2476f7962" (UID: "5802b461-0b4c-44b4-8e35-9ec2476f7962"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:41:05.379933 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.379902 2578 generic.go:358] "Generic (PLEG): container finished" podID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerID="48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6" exitCode=0 Apr 24 21:41:05.380062 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.380029 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerDied","Data":"48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6"} Apr 24 21:41:05.380179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.380082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" event={"ID":"5802b461-0b4c-44b4-8e35-9ec2476f7962","Type":"ContainerDied","Data":"558364ebbc84af8d378563f35e7725ae4e41c12aa83b14867bd97b08483dd1b9"} Apr 24 21:41:05.380179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.380100 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" Apr 24 21:41:05.380179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.380102 2578 scope.go:117] "RemoveContainer" containerID="6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b" Apr 24 21:41:05.381620 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.381594 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5802b461-0b4c-44b4-8e35-9ec2476f7962" (UID: "5802b461-0b4c-44b4-8e35-9ec2476f7962"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:41:05.381888 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.381869 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5802b461-0b4c-44b4-8e35-9ec2476f7962-kube-api-access-r967h" (OuterVolumeSpecName: "kube-api-access-r967h") pod "5802b461-0b4c-44b4-8e35-9ec2476f7962" (UID: "5802b461-0b4c-44b4-8e35-9ec2476f7962"). InnerVolumeSpecName "kube-api-access-r967h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:41:05.389495 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.389471 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5802b461-0b4c-44b4-8e35-9ec2476f7962-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5802b461-0b4c-44b4-8e35-9ec2476f7962" (UID: "5802b461-0b4c-44b4-8e35-9ec2476f7962"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:41:05.395824 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.395809 2578 scope.go:117] "RemoveContainer" containerID="48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6" Apr 24 21:41:05.402513 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.402499 2578 scope.go:117] "RemoveContainer" containerID="9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2" Apr 24 21:41:05.409097 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.409079 2578 scope.go:117] "RemoveContainer" containerID="6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b" Apr 24 21:41:05.409322 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:41:05.409306 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b\": container with ID starting with 6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b not found: ID does not exist" containerID="6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b" Apr 24 21:41:05.409374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.409329 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b"} err="failed to get container status \"6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b\": rpc error: code = NotFound desc = could not find container \"6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b\": container with ID starting with 6eb1c878c9daa526d7d4c3e543500f222380c2bc71d86b05629336cf320f178b not found: ID does not exist" Apr 24 21:41:05.409374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.409346 2578 scope.go:117] "RemoveContainer" containerID="48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6" Apr 24 21:41:05.409561 ip-10-0-134-248 
kubenswrapper[2578]: E0424 21:41:05.409544 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6\": container with ID starting with 48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6 not found: ID does not exist" containerID="48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6" Apr 24 21:41:05.409622 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.409571 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6"} err="failed to get container status \"48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6\": rpc error: code = NotFound desc = could not find container \"48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6\": container with ID starting with 48c338072d158bce5e1835c14cba7923a5213513ee65befea5469972c898edc6 not found: ID does not exist" Apr 24 21:41:05.409622 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.409595 2578 scope.go:117] "RemoveContainer" containerID="9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2" Apr 24 21:41:05.409817 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:41:05.409803 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2\": container with ID starting with 9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2 not found: ID does not exist" containerID="9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2" Apr 24 21:41:05.409862 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.409822 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2"} 
err="failed to get container status \"9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2\": rpc error: code = NotFound desc = could not find container \"9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2\": container with ID starting with 9cef838c25d2834367fa552cc8734f1037addd0f5de88ce22e917f7a68d883d2 not found: ID does not exist" Apr 24 21:41:05.480525 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.480507 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r967h\" (UniqueName: \"kubernetes.io/projected/5802b461-0b4c-44b4-8e35-9ec2476f7962-kube-api-access-r967h\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:41:05.480614 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.480526 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5802b461-0b4c-44b4-8e35-9ec2476f7962-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:41:05.480614 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.480538 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5802b461-0b4c-44b4-8e35-9ec2476f7962-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:41:05.480614 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.480546 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5802b461-0b4c-44b4-8e35-9ec2476f7962-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:41:05.702288 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.702263 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw"] Apr 24 21:41:05.705616 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:05.705595 2578 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw"] Apr 24 21:41:06.212445 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:06.212411 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 24 21:41:06.217633 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:06.217613 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-gbvvw" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: i/o timeout" Apr 24 21:41:06.660115 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:06.660081 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" path="/var/lib/kubelet/pods/5802b461-0b4c-44b4-8e35-9ec2476f7962/volumes" Apr 24 21:41:08.391252 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:08.391225 2578 generic.go:358] "Generic (PLEG): container finished" podID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerID="a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed" exitCode=0 Apr 24 21:41:08.391539 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:08.391288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerDied","Data":"a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed"} Apr 24 21:41:09.396305 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:09.396271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerStarted","Data":"28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa"} Apr 24 21:41:09.396716 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:09.396316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerStarted","Data":"cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59"} Apr 24 21:41:09.396716 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:09.396546 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:09.414230 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:09.414181 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podStartSLOduration=7.414165662 podStartE2EDuration="7.414165662s" podCreationTimestamp="2026-04-24 21:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:41:09.413673589 +0000 UTC m=+1493.381475651" watchObservedRunningTime="2026-04-24 21:41:09.414165662 +0000 UTC m=+1493.381967724" Apr 24 21:41:10.399891 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:10.399856 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:10.401009 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:10.400979 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.43:8080: connect: connection refused" Apr 24 21:41:11.403576 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:11.403532 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:41:16.407400 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:16.407371 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:41:16.407809 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:16.407784 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:41:16.635562 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:16.635531 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:41:16.636486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:16.636463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:41:26.407956 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:26.407878 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:41:36.408411 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:36.408367 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:41:46.407976 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:46.407943 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:41:56.408865 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:41:56.408829 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:42:04.366719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.366680 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp"] Apr 24 21:42:04.367156 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.367121 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" containerID="cri-o://cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59" gracePeriod=30 Apr 24 21:42:04.367228 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.367193 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kube-rbac-proxy" containerID="cri-o://28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa" gracePeriod=30 Apr 24 21:42:04.451040 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:42:04.451002 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"] Apr 24 21:42:04.451352 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451340 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="storage-initializer" Apr 24 21:42:04.451404 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451354 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="storage-initializer" Apr 24 21:42:04.451404 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451375 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kube-rbac-proxy" Apr 24 21:42:04.451404 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451381 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kube-rbac-proxy" Apr 24 21:42:04.451404 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451389 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" Apr 24 21:42:04.451404 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451395 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" Apr 24 21:42:04.451557 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451448 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kube-rbac-proxy" Apr 24 21:42:04.451557 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.451456 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5802b461-0b4c-44b4-8e35-9ec2476f7962" containerName="kserve-container" Apr 24 21:42:04.454564 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:42:04.454546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.456928 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.456902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 24 21:42:04.457070 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.457052 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 24 21:42:04.465844 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.465820 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"] Apr 24 21:42:04.550050 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.550014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsvcq\" (UniqueName: \"kubernetes.io/projected/709cd00f-b878-4cdf-b8a0-73f9f3842179-kube-api-access-dsvcq\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.550050 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.550050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/709cd00f-b878-4cdf-b8a0-73f9f3842179-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.550257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.550129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/709cd00f-b878-4cdf-b8a0-73f9f3842179-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.550257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.550148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/709cd00f-b878-4cdf-b8a0-73f9f3842179-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.560275 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.560235 2578 generic.go:358] "Generic (PLEG): container finished" podID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerID="28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa" exitCode=2 Apr 24 21:42:04.560418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.560299 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerDied","Data":"28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa"} Apr 24 21:42:04.651070 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.651042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsvcq\" (UniqueName: \"kubernetes.io/projected/709cd00f-b878-4cdf-b8a0-73f9f3842179-kube-api-access-dsvcq\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.651229 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.651077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/709cd00f-b878-4cdf-b8a0-73f9f3842179-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.651229 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.651162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/709cd00f-b878-4cdf-b8a0-73f9f3842179-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.651229 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.651186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/709cd00f-b878-4cdf-b8a0-73f9f3842179-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.651562 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.651540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/709cd00f-b878-4cdf-b8a0-73f9f3842179-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.651844 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.651823 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/709cd00f-b878-4cdf-b8a0-73f9f3842179-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: 
\"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.653806 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.653784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/709cd00f-b878-4cdf-b8a0-73f9f3842179-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.659253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.659201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsvcq\" (UniqueName: \"kubernetes.io/projected/709cd00f-b878-4cdf-b8a0-73f9f3842179-kube-api-access-dsvcq\") pod \"isvc-pmml-predictor-8bb578669-zxg86\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.764620 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.764578 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" Apr 24 21:42:04.885947 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:04.885921 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"] Apr 24 21:42:04.888219 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:42:04.888192 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709cd00f_b878_4cdf_b8a0_73f9f3842179.slice/crio-4caaf45e096707a5557c0f382f54df090a3fa65b5a057053e11cfdfa25c10ddb WatchSource:0}: Error finding container 4caaf45e096707a5557c0f382f54df090a3fa65b5a057053e11cfdfa25c10ddb: Status 404 returned error can't find the container with id 4caaf45e096707a5557c0f382f54df090a3fa65b5a057053e11cfdfa25c10ddb Apr 24 21:42:05.566346 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:05.566308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerStarted","Data":"397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2"} Apr 24 21:42:05.566346 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:05.566346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerStarted","Data":"4caaf45e096707a5557c0f382f54df090a3fa65b5a057053e11cfdfa25c10ddb"} Apr 24 21:42:06.404475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:06.404432 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused" Apr 24 21:42:06.408794 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:42:06.408769 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:42:07.209374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.209351 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:42:07.374804 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.374689 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj2s4\" (UniqueName: \"kubernetes.io/projected/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kube-api-access-pj2s4\") pod \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " Apr 24 21:42:07.374804 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.374782 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls\") pod \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " Apr 24 21:42:07.374804 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.374807 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4da6e542-bfed-4eab-80b4-83d5baba8bb9-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " Apr 24 21:42:07.375034 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.374843 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kserve-provision-location\") pod \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\" (UID: \"4da6e542-bfed-4eab-80b4-83d5baba8bb9\") " Apr 24 21:42:07.375227 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.375193 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4da6e542-bfed-4eab-80b4-83d5baba8bb9-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "4da6e542-bfed-4eab-80b4-83d5baba8bb9" (UID: "4da6e542-bfed-4eab-80b4-83d5baba8bb9"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:42:07.376950 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.376926 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kube-api-access-pj2s4" (OuterVolumeSpecName: "kube-api-access-pj2s4") pod "4da6e542-bfed-4eab-80b4-83d5baba8bb9" (UID: "4da6e542-bfed-4eab-80b4-83d5baba8bb9"). InnerVolumeSpecName "kube-api-access-pj2s4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:42:07.377052 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.376980 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4da6e542-bfed-4eab-80b4-83d5baba8bb9" (UID: "4da6e542-bfed-4eab-80b4-83d5baba8bb9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:42:07.383838 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.383807 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4da6e542-bfed-4eab-80b4-83d5baba8bb9" (UID: "4da6e542-bfed-4eab-80b4-83d5baba8bb9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:07.476402 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.476365 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4da6e542-bfed-4eab-80b4-83d5baba8bb9-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:42:07.476402 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.476397 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4da6e542-bfed-4eab-80b4-83d5baba8bb9-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:42:07.476402 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.476406 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:42:07.476705 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.476418 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pj2s4\" (UniqueName: \"kubernetes.io/projected/4da6e542-bfed-4eab-80b4-83d5baba8bb9-kube-api-access-pj2s4\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:42:07.574209 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.574173 2578 generic.go:358] "Generic 
(PLEG): container finished" podID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerID="cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59" exitCode=0 Apr 24 21:42:07.574384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.574257 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" Apr 24 21:42:07.574384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.574256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerDied","Data":"cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59"} Apr 24 21:42:07.574384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.574299 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp" event={"ID":"4da6e542-bfed-4eab-80b4-83d5baba8bb9","Type":"ContainerDied","Data":"f0957c355dbe3bbc418853ebc2740ab4e9f7d1c847cdfdc62c2aedc12c0e4efc"} Apr 24 21:42:07.574384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.574318 2578 scope.go:117] "RemoveContainer" containerID="28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa" Apr 24 21:42:07.582520 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.582487 2578 scope.go:117] "RemoveContainer" containerID="cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59" Apr 24 21:42:07.589519 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.589504 2578 scope.go:117] "RemoveContainer" containerID="a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed" Apr 24 21:42:07.595800 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.595773 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp"] Apr 24 21:42:07.597555 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:42:07.597522 2578 scope.go:117] "RemoveContainer" containerID="28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa" Apr 24 21:42:07.597905 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:42:07.597878 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa\": container with ID starting with 28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa not found: ID does not exist" containerID="28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa" Apr 24 21:42:07.597993 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.597916 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa"} err="failed to get container status \"28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa\": rpc error: code = NotFound desc = could not find container \"28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa\": container with ID starting with 28ce2554bba7d5bd94cb7e7294264aacf33d9357aab206bd2eb7ebc0cf12cefa not found: ID does not exist" Apr 24 21:42:07.597993 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.597942 2578 scope.go:117] "RemoveContainer" containerID="cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59" Apr 24 21:42:07.598192 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:42:07.598173 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59\": container with ID starting with cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59 not found: ID does not exist" containerID="cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59" Apr 24 21:42:07.598257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.598198 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59"} err="failed to get container status \"cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59\": rpc error: code = NotFound desc = could not find container \"cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59\": container with ID starting with cb08e7745ffa9f0e2ceb561e7ecf3c11f5952b884d4d378af668325f8a85ec59 not found: ID does not exist"
Apr 24 21:42:07.598257 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.598218 2578 scope.go:117] "RemoveContainer" containerID="a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed"
Apr 24 21:42:07.598356 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.598277 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-bnccp"]
Apr 24 21:42:07.598436 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:42:07.598422 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed\": container with ID starting with a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed not found: ID does not exist" containerID="a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed"
Apr 24 21:42:07.598490 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:07.598439 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed"} err="failed to get container status \"a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed\": rpc error: code = NotFound desc = could not find container \"a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed\": container with ID starting with a162b4b5d0ee5cb741ec78b8e41599bfd00116687ac39d973d52807664c818ed not found: ID does not exist"
Apr 24 21:42:08.578822 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:08.578789 2578 generic.go:358] "Generic (PLEG): container finished" podID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerID="397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2" exitCode=0
Apr 24 21:42:08.579163 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:08.578840 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerDied","Data":"397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2"}
Apr 24 21:42:08.659990 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:08.659957 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" path="/var/lib/kubelet/pods/4da6e542-bfed-4eab-80b4-83d5baba8bb9/volumes"
Apr 24 21:42:15.606650 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:15.606614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerStarted","Data":"0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3"}
Apr 24 21:42:15.607054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:15.606660 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerStarted","Data":"801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20"}
Apr 24 21:42:15.607054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:15.606889 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"
Apr 24 21:42:15.626203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:15.626156 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podStartSLOduration=4.81184991 podStartE2EDuration="11.626138453s" podCreationTimestamp="2026-04-24 21:42:04 +0000 UTC" firstStartedPulling="2026-04-24 21:42:08.579927629 +0000 UTC m=+1552.547729669" lastFinishedPulling="2026-04-24 21:42:15.394216168 +0000 UTC m=+1559.362018212" observedRunningTime="2026-04-24 21:42:15.625204582 +0000 UTC m=+1559.593006669" watchObservedRunningTime="2026-04-24 21:42:15.626138453 +0000 UTC m=+1559.593940517"
Apr 24 21:42:16.610274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:16.610242 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"
Apr 24 21:42:16.611471 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:16.611443 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:42:17.613021 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:17.612976 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:42:22.617344 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:22.617319 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"
Apr 24 21:42:22.617916 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:22.617888 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:42:32.618164 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:32.618129 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:42:42.618302 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:42.618257 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:42:52.618332 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:42:52.618291 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:43:02.618894 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:02.618812 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:43:12.618844 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:12.618805 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:43:22.618830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:22.618787 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:43:32.618613 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:32.618583 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"
Apr 24 21:43:35.487276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:35.487242 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"]
Apr 24 21:43:35.487644 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:35.487565 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container" containerID="cri-o://801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20" gracePeriod=30
Apr 24 21:43:35.487644 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:35.487573 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kube-rbac-proxy" containerID="cri-o://0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3" gracePeriod=30
Apr 24 21:43:35.840241 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:35.840159 2578 generic.go:358] "Generic (PLEG): container finished" podID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerID="0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3" exitCode=2
Apr 24 21:43:35.840241 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:35.840202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerDied","Data":"0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3"}
Apr 24 21:43:37.613534 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:37.613491 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused"
Apr 24 21:43:38.433549 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.433520 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"
Apr 24 21:43:38.532544 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.532485 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/709cd00f-b878-4cdf-b8a0-73f9f3842179-kserve-provision-location\") pod \"709cd00f-b878-4cdf-b8a0-73f9f3842179\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") "
Apr 24 21:43:38.532544 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.532517 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/709cd00f-b878-4cdf-b8a0-73f9f3842179-proxy-tls\") pod \"709cd00f-b878-4cdf-b8a0-73f9f3842179\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") "
Apr 24 21:43:38.532707 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.532553 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsvcq\" (UniqueName: \"kubernetes.io/projected/709cd00f-b878-4cdf-b8a0-73f9f3842179-kube-api-access-dsvcq\") pod \"709cd00f-b878-4cdf-b8a0-73f9f3842179\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") "
Apr 24 21:43:38.532707 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.532604 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/709cd00f-b878-4cdf-b8a0-73f9f3842179-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"709cd00f-b878-4cdf-b8a0-73f9f3842179\" (UID: \"709cd00f-b878-4cdf-b8a0-73f9f3842179\") "
Apr 24 21:43:38.532851 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.532721 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709cd00f-b878-4cdf-b8a0-73f9f3842179-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "709cd00f-b878-4cdf-b8a0-73f9f3842179" (UID: "709cd00f-b878-4cdf-b8a0-73f9f3842179"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:43:38.532994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.532968 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/709cd00f-b878-4cdf-b8a0-73f9f3842179-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "709cd00f-b878-4cdf-b8a0-73f9f3842179" (UID: "709cd00f-b878-4cdf-b8a0-73f9f3842179"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:43:38.534596 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.534563 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709cd00f-b878-4cdf-b8a0-73f9f3842179-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "709cd00f-b878-4cdf-b8a0-73f9f3842179" (UID: "709cd00f-b878-4cdf-b8a0-73f9f3842179"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:43:38.534694 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.534635 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709cd00f-b878-4cdf-b8a0-73f9f3842179-kube-api-access-dsvcq" (OuterVolumeSpecName: "kube-api-access-dsvcq") pod "709cd00f-b878-4cdf-b8a0-73f9f3842179" (UID: "709cd00f-b878-4cdf-b8a0-73f9f3842179"). InnerVolumeSpecName "kube-api-access-dsvcq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:43:38.633457 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.633432 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dsvcq\" (UniqueName: \"kubernetes.io/projected/709cd00f-b878-4cdf-b8a0-73f9f3842179-kube-api-access-dsvcq\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:43:38.633457 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.633453 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/709cd00f-b878-4cdf-b8a0-73f9f3842179-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:43:38.633741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.633463 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/709cd00f-b878-4cdf-b8a0-73f9f3842179-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:43:38.633741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.633473 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/709cd00f-b878-4cdf-b8a0-73f9f3842179-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:43:38.850373 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.850311 2578 generic.go:358] "Generic (PLEG): container finished" podID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerID="801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20" exitCode=0
Apr 24 21:43:38.850475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.850369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerDied","Data":"801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20"}
Apr 24 21:43:38.850475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.850391 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"
Apr 24 21:43:38.850475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.850399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86" event={"ID":"709cd00f-b878-4cdf-b8a0-73f9f3842179","Type":"ContainerDied","Data":"4caaf45e096707a5557c0f382f54df090a3fa65b5a057053e11cfdfa25c10ddb"}
Apr 24 21:43:38.850475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.850418 2578 scope.go:117] "RemoveContainer" containerID="0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3"
Apr 24 21:43:38.858423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.858406 2578 scope.go:117] "RemoveContainer" containerID="801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20"
Apr 24 21:43:38.865902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.865880 2578 scope.go:117] "RemoveContainer" containerID="397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2"
Apr 24 21:43:38.867010 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.866987 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"]
Apr 24 21:43:38.870702 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.870671 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zxg86"]
Apr 24 21:43:38.873386 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.873365 2578 scope.go:117] "RemoveContainer" containerID="0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3"
Apr 24 21:43:38.873643 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:43:38.873623 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3\": container with ID starting with 0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3 not found: ID does not exist" containerID="0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3"
Apr 24 21:43:38.873724 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.873655 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3"} err="failed to get container status \"0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3\": rpc error: code = NotFound desc = could not find container \"0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3\": container with ID starting with 0589308e67974c486c30bf5399aefb764c3a69040d1d167174972cac858821d3 not found: ID does not exist"
Apr 24 21:43:38.873724 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.873679 2578 scope.go:117] "RemoveContainer" containerID="801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20"
Apr 24 21:43:38.873946 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:43:38.873928 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20\": container with ID starting with 801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20 not found: ID does not exist" containerID="801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20"
Apr 24 21:43:38.874004 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.873952 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20"} err="failed to get container status \"801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20\": rpc error: code = NotFound desc = could not find container \"801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20\": container with ID starting with 801864b3a3ec7e80578194b0f961478944f241ba2ad6e197e1b25122181fcf20 not found: ID does not exist"
Apr 24 21:43:38.874004 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.873969 2578 scope.go:117] "RemoveContainer" containerID="397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2"
Apr 24 21:43:38.874204 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:43:38.874189 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2\": container with ID starting with 397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2 not found: ID does not exist" containerID="397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2"
Apr 24 21:43:38.874242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:38.874209 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2"} err="failed to get container status \"397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2\": rpc error: code = NotFound desc = could not find container \"397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2\": container with ID starting with 397b7be0021ecb077ed5778b7cdc8c556626fae8b1c9b19ea9bdb8df992f29e2 not found: ID does not exist"
Apr 24 21:43:40.660008 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:43:40.659975 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" path="/var/lib/kubelet/pods/709cd00f-b878-4cdf-b8a0-73f9f3842179/volumes"
Apr 24 21:46:16.657279 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:16.657245 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 21:46:16.660623 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:16.660602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 21:46:58.335870 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.335836 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"]
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336202 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336219 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336235 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kube-rbac-proxy"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336241 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kube-rbac-proxy"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336253 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336258 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336269 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="storage-initializer"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336274 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="storage-initializer"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336280 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="storage-initializer"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336285 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="storage-initializer"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336291 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kube-rbac-proxy"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336296 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kube-rbac-proxy"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336351 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kube-rbac-proxy"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336360 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="709cd00f-b878-4cdf-b8a0-73f9f3842179" containerName="kserve-container"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336366 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kube-rbac-proxy"
Apr 24 21:46:58.338090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.336374 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4da6e542-bfed-4eab-80b4-83d5baba8bb9" containerName="kserve-container"
Apr 24 21:46:58.339169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.339154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.343381 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.343349 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:46:58.343540 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.343407 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-d8276f-kube-rbac-proxy-sar-config\""
Apr 24 21:46:58.343540 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.343443 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-d8276f-predictor-serving-cert\""
Apr 24 21:46:58.344260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.344242 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:46:58.344882 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.344866 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\""
Apr 24 21:46:58.366211 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.366184 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"]
Apr 24 21:46:58.462688 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.462659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa9284c6-f35f-424e-b0ab-ece265617b53-kserve-provision-location\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.462830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.462701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggpl\" (UniqueName: \"kubernetes.io/projected/fa9284c6-f35f-424e-b0ab-ece265617b53-kube-api-access-bggpl\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.462830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.462728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fa9284c6-f35f-424e-b0ab-ece265617b53-isvc-primary-d8276f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.462830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.462772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa9284c6-f35f-424e-b0ab-ece265617b53-proxy-tls\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.563683 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.563658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bggpl\" (UniqueName: \"kubernetes.io/projected/fa9284c6-f35f-424e-b0ab-ece265617b53-kube-api-access-bggpl\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.563809 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.563697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fa9284c6-f35f-424e-b0ab-ece265617b53-isvc-primary-d8276f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.563809 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.563719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa9284c6-f35f-424e-b0ab-ece265617b53-proxy-tls\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.563809 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.563789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa9284c6-f35f-424e-b0ab-ece265617b53-kserve-provision-location\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.564166 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.564147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa9284c6-f35f-424e-b0ab-ece265617b53-kserve-provision-location\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.564298 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.564280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fa9284c6-f35f-424e-b0ab-ece265617b53-isvc-primary-d8276f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.566338 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.566319 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa9284c6-f35f-424e-b0ab-ece265617b53-proxy-tls\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.576156 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.576127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggpl\" (UniqueName: \"kubernetes.io/projected/fa9284c6-f35f-424e-b0ab-ece265617b53-kube-api-access-bggpl\") pod \"isvc-primary-d8276f-predictor-5555dbfb49-pz2td\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") " pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.650956 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.650934 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:46:58.798183 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.798160 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"]
Apr 24 21:46:58.800806 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:46:58.800779 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9284c6_f35f_424e_b0ab_ece265617b53.slice/crio-49be55d8222d317ed2c209e7a0cbaf5cec6930230dfd84be894bfa1b0babadf7 WatchSource:0}: Error finding container 49be55d8222d317ed2c209e7a0cbaf5cec6930230dfd84be894bfa1b0babadf7: Status 404 returned error can't find the container with id 49be55d8222d317ed2c209e7a0cbaf5cec6930230dfd84be894bfa1b0babadf7
Apr 24 21:46:58.802581 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:58.802560 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:46:59.426625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:59.426589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerStarted","Data":"d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161"}
Apr 24 21:46:59.426625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:46:59.426625 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerStarted","Data":"49be55d8222d317ed2c209e7a0cbaf5cec6930230dfd84be894bfa1b0babadf7"}
Apr 24 21:47:03.440334 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:03.440297 2578 generic.go:358] "Generic (PLEG): container finished" podID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerID="d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161" exitCode=0
Apr 24 21:47:03.440827 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:03.440382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerDied","Data":"d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161"}
Apr 24 21:47:04.445839 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:04.445802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerStarted","Data":"af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc"}
Apr 24 21:47:04.445839 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:04.445843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerStarted","Data":"7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9"}
Apr 24 21:47:04.446225 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:04.446039 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:47:04.465202 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:04.465155 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podStartSLOduration=6.465141707 podStartE2EDuration="6.465141707s" podCreationTimestamp="2026-04-24 21:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:04.462885865 +0000 UTC m=+1848.430687928" watchObservedRunningTime="2026-04-24 21:47:04.465141707 +0000 UTC m=+1848.432943847"
Apr 24 21:47:05.449018 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:05.448976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:47:05.450200 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:05.450171 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 24 21:47:06.451224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:06.451185 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 24 21:47:11.455311 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:11.455284 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:47:11.455733 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:11.455708 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 24 21:47:21.456509 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:21.456464 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial
tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:47:31.455814 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:31.455718 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:47:41.455704 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:41.455666 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:47:51.456168 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:47:51.456127 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:48:01.455911 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:01.455883 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" Apr 24 21:48:08.344951 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.344915 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"] Apr 24 21:48:08.348801 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.348741 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.352090 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.352069 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-d8276f\"" Apr 24 21:48:08.352208 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.352079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-d8276f-predictor-serving-cert\"" Apr 24 21:48:08.352208 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.352174 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-d8276f-dockercfg-shghl\"" Apr 24 21:48:08.352349 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.352331 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 21:48:08.352431 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.352414 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-d8276f-kube-rbac-proxy-sar-config\"" Apr 24 21:48:08.361442 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.361422 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"] Apr 24 21:48:08.373134 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.373113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-cabundle-cert\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.373230 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.373156 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ea2f55-4922-492d-a1dc-262944b17b3a-kserve-provision-location\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.373230 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.373217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.373304 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.373249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkmr\" (UniqueName: \"kubernetes.io/projected/99ea2f55-4922-492d-a1dc-262944b17b3a-kube-api-access-sqkmr\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.373304 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.373292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-isvc-secondary-d8276f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.474473 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.474444 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.474566 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.474488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqkmr\" (UniqueName: \"kubernetes.io/projected/99ea2f55-4922-492d-a1dc-262944b17b3a-kube-api-access-sqkmr\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.474566 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.474515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-isvc-secondary-d8276f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.474566 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.474532 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-cabundle-cert\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.474693 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.474567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ea2f55-4922-492d-a1dc-262944b17b3a-kserve-provision-location\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.474693 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:08.474606 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-serving-cert: secret "isvc-secondary-d8276f-predictor-serving-cert" not found Apr 24 21:48:08.474693 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:08.474674 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls podName:99ea2f55-4922-492d-a1dc-262944b17b3a nodeName:}" failed. No retries permitted until 2026-04-24 21:48:08.97465537 +0000 UTC m=+1912.942457410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls") pod "isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" (UID: "99ea2f55-4922-492d-a1dc-262944b17b3a") : secret "isvc-secondary-d8276f-predictor-serving-cert" not found Apr 24 21:48:08.475006 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.474988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ea2f55-4922-492d-a1dc-262944b17b3a-kserve-provision-location\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.475247 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.475227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-isvc-secondary-d8276f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.475285 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.475227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-cabundle-cert\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.485275 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.485254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkmr\" (UniqueName: \"kubernetes.io/projected/99ea2f55-4922-492d-a1dc-262944b17b3a-kube-api-access-sqkmr\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.978171 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.978139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:08.980600 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:08.980580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls\") pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7\" (UID: 
\"99ea2f55-4922-492d-a1dc-262944b17b3a\") " pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:09.259894 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:09.259832 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:09.379789 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:09.379761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"] Apr 24 21:48:09.382244 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:48:09.382202 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ea2f55_4922_492d_a1dc_262944b17b3a.slice/crio-9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098 WatchSource:0}: Error finding container 9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098: Status 404 returned error can't find the container with id 9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098 Apr 24 21:48:09.633374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:09.633290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" event={"ID":"99ea2f55-4922-492d-a1dc-262944b17b3a","Type":"ContainerStarted","Data":"572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2"} Apr 24 21:48:09.633374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:09.633330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" event={"ID":"99ea2f55-4922-492d-a1dc-262944b17b3a","Type":"ContainerStarted","Data":"9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098"} Apr 24 21:48:13.646523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:13.646496 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/0.log" Apr 24 21:48:13.646848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:13.646537 2578 generic.go:358] "Generic (PLEG): container finished" podID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerID="572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2" exitCode=1 Apr 24 21:48:13.646848 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:13.646620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" event={"ID":"99ea2f55-4922-492d-a1dc-262944b17b3a","Type":"ContainerDied","Data":"572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2"} Apr 24 21:48:14.651018 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:14.650992 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/0.log" Apr 24 21:48:14.651362 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:14.651108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" event={"ID":"99ea2f55-4922-492d-a1dc-262944b17b3a","Type":"ContainerStarted","Data":"f101b3ff7aa86dfafbaaae16f92a47ffa30e9b39978983d800925f7f8934602c"} Apr 24 21:48:17.662040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:17.662015 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/1.log" Apr 24 21:48:17.662425 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:17.662324 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/0.log" Apr 24 
21:48:17.662425 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:17.662353 2578 generic.go:358] "Generic (PLEG): container finished" podID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerID="f101b3ff7aa86dfafbaaae16f92a47ffa30e9b39978983d800925f7f8934602c" exitCode=1 Apr 24 21:48:17.662533 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:17.662452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" event={"ID":"99ea2f55-4922-492d-a1dc-262944b17b3a","Type":"ContainerDied","Data":"f101b3ff7aa86dfafbaaae16f92a47ffa30e9b39978983d800925f7f8934602c"} Apr 24 21:48:17.662533 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:17.662504 2578 scope.go:117] "RemoveContainer" containerID="572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2" Apr 24 21:48:17.662853 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:17.662829 2578 scope.go:117] "RemoveContainer" containerID="572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2" Apr 24 21:48:17.675565 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:17.675527 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_kserve-ci-e2e-test_99ea2f55-4922-492d-a1dc-262944b17b3a_0 in pod sandbox 9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098 from index: no such id: '572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2'" containerID="572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2" Apr 24 21:48:17.675650 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:17.675591 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_kserve-ci-e2e-test_99ea2f55-4922-492d-a1dc-262944b17b3a_0 in pod sandbox 9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098 from index: no such id: '572949736f3544efbe1968d4007dbd053e8a9caa26da097b3e5a67edaf1928b2'; Skipping pod \"isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_kserve-ci-e2e-test(99ea2f55-4922-492d-a1dc-262944b17b3a)\"" logger="UnhandledError" Apr 24 21:48:17.676937 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:17.676916 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_kserve-ci-e2e-test(99ea2f55-4922-492d-a1dc-262944b17b3a)\"" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" Apr 24 21:48:18.666403 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:18.666378 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/1.log" Apr 24 21:48:24.382154 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.382121 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"] Apr 24 21:48:24.382541 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.382496 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" containerID="cri-o://7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9" gracePeriod=30 Apr 24 21:48:24.382623 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.382520 2578 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kube-rbac-proxy" containerID="cri-o://af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc" gracePeriod=30 Apr 24 21:48:24.441349 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.441319 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"] Apr 24 21:48:24.525740 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.525710 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"] Apr 24 21:48:24.529427 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.529407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" Apr 24 21:48:24.531770 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.531721 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-30f3f9\"" Apr 24 21:48:24.531887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.531778 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\"" Apr 24 21:48:24.531887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.531838 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-30f3f9-dockercfg-jsgjz\"" Apr 24 21:48:24.531994 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.531886 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-30f3f9-predictor-serving-cert\"" Apr 24 21:48:24.539891 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.539873 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"] Apr 24 21:48:24.577523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.577506 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/1.log" Apr 24 21:48:24.577646 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.577564 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" Apr 24 21:48:24.593934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.593913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" Apr 24 21:48:24.594024 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.593943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" Apr 24 21:48:24.594024 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.593986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-cabundle-cert\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" Apr 24 21:48:24.594132 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.594051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqln2\" (UniqueName: \"kubernetes.io/projected/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kube-api-access-wqln2\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" Apr 24 21:48:24.594132 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.594071 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kserve-provision-location\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" Apr 24 21:48:24.684049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.684021 2578 generic.go:358] "Generic (PLEG): container finished" podID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerID="af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc" exitCode=2 Apr 24 21:48:24.684173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.684088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerDied","Data":"af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc"} Apr 24 21:48:24.685167 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.685142 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7_99ea2f55-4922-492d-a1dc-262944b17b3a/storage-initializer/1.log" Apr 24 21:48:24.685268 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.685202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7" event={"ID":"99ea2f55-4922-492d-a1dc-262944b17b3a","Type":"ContainerDied","Data":"9c9195be46638b9aca4bd1e88d0f594c4ebc85c82d9d3226b4a0561b756fc098"}
Apr 24 21:48:24.685268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.685230 2578 scope.go:117] "RemoveContainer" containerID="f101b3ff7aa86dfafbaaae16f92a47ffa30e9b39978983d800925f7f8934602c"
Apr 24 21:48:24.685268 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.685239 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"
Apr 24 21:48:24.694332 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqkmr\" (UniqueName: \"kubernetes.io/projected/99ea2f55-4922-492d-a1dc-262944b17b3a-kube-api-access-sqkmr\") pod \"99ea2f55-4922-492d-a1dc-262944b17b3a\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") "
Apr 24 21:48:24.694397 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694370 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-isvc-secondary-d8276f-kube-rbac-proxy-sar-config\") pod \"99ea2f55-4922-492d-a1dc-262944b17b3a\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") "
Apr 24 21:48:24.694485 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694430 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ea2f55-4922-492d-a1dc-262944b17b3a-kserve-provision-location\") pod \"99ea2f55-4922-492d-a1dc-262944b17b3a\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") "
Apr 24 21:48:24.694525 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694493 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-cabundle-cert\") pod \"99ea2f55-4922-492d-a1dc-262944b17b3a\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") "
Apr 24 21:48:24.694525 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694516 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls\") pod \"99ea2f55-4922-492d-a1dc-262944b17b3a\" (UID: \"99ea2f55-4922-492d-a1dc-262944b17b3a\") "
Apr 24 21:48:24.694663 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqln2\" (UniqueName: \"kubernetes.io/projected/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kube-api-access-wqln2\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.694741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kserve-provision-location\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.694741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.694741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694725 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ea2f55-4922-492d-a1dc-262944b17b3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99ea2f55-4922-492d-a1dc-262944b17b3a" (UID: "99ea2f55-4922-492d-a1dc-262944b17b3a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694819 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-isvc-secondary-d8276f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-d8276f-kube-rbac-proxy-sar-config") pod "99ea2f55-4922-492d-a1dc-262944b17b3a" (UID: "99ea2f55-4922-492d-a1dc-262944b17b3a"). InnerVolumeSpecName "isvc-secondary-d8276f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-cabundle-cert\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694853 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "99ea2f55-4922-492d-a1dc-262944b17b3a" (UID: "99ea2f55-4922-492d-a1dc-262944b17b3a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694895 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-isvc-secondary-d8276f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.694911 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ea2f55-4922-492d-a1dc-262944b17b3a-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:24.694920 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:24.694915 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-serving-cert: secret "isvc-init-fail-30f3f9-predictor-serving-cert" not found
Apr 24 21:48:24.695218 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:24.694972 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls podName:dc77c860-8b3d-4516-8561-7dd8ffe42bbe nodeName:}" failed. No retries permitted until 2026-04-24 21:48:25.194953683 +0000 UTC m=+1929.162755731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls") pod "isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" (UID: "dc77c860-8b3d-4516-8561-7dd8ffe42bbe") : secret "isvc-init-fail-30f3f9-predictor-serving-cert" not found
Apr 24 21:48:24.695218 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.695118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kserve-provision-location\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.695438 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.695421 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-cabundle-cert\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.695523 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.695504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.697049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.697026 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99ea2f55-4922-492d-a1dc-262944b17b3a" (UID: "99ea2f55-4922-492d-a1dc-262944b17b3a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:48:24.697133 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.697121 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ea2f55-4922-492d-a1dc-262944b17b3a-kube-api-access-sqkmr" (OuterVolumeSpecName: "kube-api-access-sqkmr") pod "99ea2f55-4922-492d-a1dc-262944b17b3a" (UID: "99ea2f55-4922-492d-a1dc-262944b17b3a"). InnerVolumeSpecName "kube-api-access-sqkmr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:48:24.705117 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.705096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqln2\" (UniqueName: \"kubernetes.io/projected/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kube-api-access-wqln2\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:24.795874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.795852 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/99ea2f55-4922-492d-a1dc-262944b17b3a-cabundle-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:24.795874 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.795876 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ea2f55-4922-492d-a1dc-262944b17b3a-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:24.796034 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:24.795890 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqkmr\" (UniqueName: \"kubernetes.io/projected/99ea2f55-4922-492d-a1dc-262944b17b3a-kube-api-access-sqkmr\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:25.020048 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.020023 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"]
Apr 24 21:48:25.025423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.025400 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d8276f-predictor-85d5b564f4-rz7x7"]
Apr 24 21:48:25.198494 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.198468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:25.200934 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.200912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls\") pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") " pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:25.442017 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.441991 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:25.562288 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.562263 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"]
Apr 24 21:48:25.565174 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:48:25.565142 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc77c860_8b3d_4516_8561_7dd8ffe42bbe.slice/crio-cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46 WatchSource:0}: Error finding container cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46: Status 404 returned error can't find the container with id cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46
Apr 24 21:48:25.689283 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.689251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" event={"ID":"dc77c860-8b3d-4516-8561-7dd8ffe42bbe","Type":"ContainerStarted","Data":"595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828"}
Apr 24 21:48:25.689422 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:25.689291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" event={"ID":"dc77c860-8b3d-4516-8561-7dd8ffe42bbe","Type":"ContainerStarted","Data":"cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46"}
Apr 24 21:48:26.452224 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:26.452183 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused"
Apr 24 21:48:26.659717 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:26.659686 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" path="/var/lib/kubelet/pods/99ea2f55-4922-492d-a1dc-262944b17b3a/volumes"
Apr 24 21:48:28.229435 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.229412 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:48:28.321112 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.321050 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bggpl\" (UniqueName: \"kubernetes.io/projected/fa9284c6-f35f-424e-b0ab-ece265617b53-kube-api-access-bggpl\") pod \"fa9284c6-f35f-424e-b0ab-ece265617b53\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") "
Apr 24 21:48:28.321112 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.321104 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fa9284c6-f35f-424e-b0ab-ece265617b53-isvc-primary-d8276f-kube-rbac-proxy-sar-config\") pod \"fa9284c6-f35f-424e-b0ab-ece265617b53\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") "
Apr 24 21:48:28.321276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.321132 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa9284c6-f35f-424e-b0ab-ece265617b53-proxy-tls\") pod \"fa9284c6-f35f-424e-b0ab-ece265617b53\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") "
Apr 24 21:48:28.321276 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.321255 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa9284c6-f35f-424e-b0ab-ece265617b53-kserve-provision-location\") pod \"fa9284c6-f35f-424e-b0ab-ece265617b53\" (UID: \"fa9284c6-f35f-424e-b0ab-ece265617b53\") "
Apr 24 21:48:28.321514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.321488 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9284c6-f35f-424e-b0ab-ece265617b53-isvc-primary-d8276f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-d8276f-kube-rbac-proxy-sar-config") pod "fa9284c6-f35f-424e-b0ab-ece265617b53" (UID: "fa9284c6-f35f-424e-b0ab-ece265617b53"). InnerVolumeSpecName "isvc-primary-d8276f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:28.321514 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.321494 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9284c6-f35f-424e-b0ab-ece265617b53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fa9284c6-f35f-424e-b0ab-ece265617b53" (UID: "fa9284c6-f35f-424e-b0ab-ece265617b53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:48:28.323286 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.323265 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa9284c6-f35f-424e-b0ab-ece265617b53-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fa9284c6-f35f-424e-b0ab-ece265617b53" (UID: "fa9284c6-f35f-424e-b0ab-ece265617b53"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:48:28.323356 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.323328 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9284c6-f35f-424e-b0ab-ece265617b53-kube-api-access-bggpl" (OuterVolumeSpecName: "kube-api-access-bggpl") pod "fa9284c6-f35f-424e-b0ab-ece265617b53" (UID: "fa9284c6-f35f-424e-b0ab-ece265617b53"). InnerVolumeSpecName "kube-api-access-bggpl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:48:28.422515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.422493 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa9284c6-f35f-424e-b0ab-ece265617b53-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:28.422515 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.422513 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bggpl\" (UniqueName: \"kubernetes.io/projected/fa9284c6-f35f-424e-b0ab-ece265617b53-kube-api-access-bggpl\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:28.422630 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.422523 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-d8276f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fa9284c6-f35f-424e-b0ab-ece265617b53-isvc-primary-d8276f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:28.422630 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.422533 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa9284c6-f35f-424e-b0ab-ece265617b53-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:28.701168 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.701138 2578 generic.go:358] "Generic (PLEG): container finished" podID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerID="7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9" exitCode=0
Apr 24 21:48:28.701274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.701212 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"
Apr 24 21:48:28.701274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.701215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerDied","Data":"7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9"}
Apr 24 21:48:28.701274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.701251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td" event={"ID":"fa9284c6-f35f-424e-b0ab-ece265617b53","Type":"ContainerDied","Data":"49be55d8222d317ed2c209e7a0cbaf5cec6930230dfd84be894bfa1b0babadf7"}
Apr 24 21:48:28.701274 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.701266 2578 scope.go:117] "RemoveContainer" containerID="af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc"
Apr 24 21:48:28.710766 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.710727 2578 scope.go:117] "RemoveContainer" containerID="7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9"
Apr 24 21:48:28.717553 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.717537 2578 scope.go:117] "RemoveContainer" containerID="d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161"
Apr 24 21:48:28.721139 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.721120 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"]
Apr 24 21:48:28.724847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.724821 2578 scope.go:117] "RemoveContainer" containerID="af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc"
Apr 24 21:48:28.725199 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:28.725175 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc\": container with ID starting with af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc not found: ID does not exist" containerID="af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc"
Apr 24 21:48:28.725303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.725210 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc"} err="failed to get container status \"af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc\": rpc error: code = NotFound desc = could not find container \"af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc\": container with ID starting with af8745bcdf6e501b6f091ae77f564eb7749d96a616ab18d319b1ed43bdc573cc not found: ID does not exist"
Apr 24 21:48:28.725303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.725234 2578 scope.go:117] "RemoveContainer" containerID="7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9"
Apr 24 21:48:28.725488 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:28.725466 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9\": container with ID starting with 7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9 not found: ID does not exist" containerID="7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9"
Apr 24 21:48:28.725585 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.725490 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9"} err="failed to get container status \"7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9\": rpc error: code = NotFound desc = could not find container \"7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9\": container with ID starting with 7951a2c62330951bd609e49b24ee04b61e315622306d1ae24d283f493acd8fe9 not found: ID does not exist"
Apr 24 21:48:28.725585 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.725504 2578 scope.go:117] "RemoveContainer" containerID="d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161"
Apr 24 21:48:28.725803 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:28.725728 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161\": container with ID starting with d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161 not found: ID does not exist" containerID="d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161"
Apr 24 21:48:28.725803 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.725774 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161"} err="failed to get container status \"d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161\": rpc error: code = NotFound desc = could not find container \"d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161\": container with ID starting with d3d9084afa3f99a5184f0d0007e94b39055cc272933aaa866caca87354bbf161 not found: ID does not exist"
Apr 24 21:48:28.726889 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:28.726872 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d8276f-predictor-5555dbfb49-pz2td"]
Apr 24 21:48:29.707315 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:29.707288 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/0.log"
Apr 24 21:48:29.707706 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:29.707326 2578 generic.go:358] "Generic (PLEG): container finished" podID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerID="595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828" exitCode=1
Apr 24 21:48:29.707706 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:29.707369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" event={"ID":"dc77c860-8b3d-4516-8561-7dd8ffe42bbe","Type":"ContainerDied","Data":"595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828"}
Apr 24 21:48:30.659455 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:30.659423 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" path="/var/lib/kubelet/pods/fa9284c6-f35f-424e-b0ab-ece265617b53/volumes"
Apr 24 21:48:30.711606 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:30.711581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/0.log"
Apr 24 21:48:30.711978 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:30.711694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" event={"ID":"dc77c860-8b3d-4516-8561-7dd8ffe42bbe","Type":"ContainerStarted","Data":"6696934852d7904452aff159dd4031fb0ccf43dc6794e95c73dc9b4417cd5d56"}
Apr 24 21:48:32.719806 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:32.719779 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/1.log"
Apr 24 21:48:32.720191 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:32.720151 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/0.log"
Apr 24 21:48:32.720233 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:32.720184 2578 generic.go:358] "Generic (PLEG): container finished" podID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerID="6696934852d7904452aff159dd4031fb0ccf43dc6794e95c73dc9b4417cd5d56" exitCode=1
Apr 24 21:48:32.720233 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:32.720210 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" event={"ID":"dc77c860-8b3d-4516-8561-7dd8ffe42bbe","Type":"ContainerDied","Data":"6696934852d7904452aff159dd4031fb0ccf43dc6794e95c73dc9b4417cd5d56"}
Apr 24 21:48:32.720297 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:32.720234 2578 scope.go:117] "RemoveContainer" containerID="595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828"
Apr 24 21:48:32.720586 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:32.720567 2578 scope.go:117] "RemoveContainer" containerID="595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828"
Apr 24 21:48:32.730291 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:32.730265 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_kserve-ci-e2e-test_dc77c860-8b3d-4516-8561-7dd8ffe42bbe_0 in pod sandbox cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46 from index: no such id: '595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828'" containerID="595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828"
Apr 24 21:48:32.730360 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:32.730309 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_kserve-ci-e2e-test_dc77c860-8b3d-4516-8561-7dd8ffe42bbe_0 in pod sandbox cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46 from index: no such id: '595765ff317a29247bbe62fe7478f4ac3c39782be3a1e5e26d91b6cca3e58828'; Skipping pod \"isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_kserve-ci-e2e-test(dc77c860-8b3d-4516-8561-7dd8ffe42bbe)\"" logger="UnhandledError"
Apr 24 21:48:32.731607 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:48:32.731589 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_kserve-ci-e2e-test(dc77c860-8b3d-4516-8561-7dd8ffe42bbe)\"" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe"
Apr 24 21:48:33.725636 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:33.725611 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/1.log"
Apr 24 21:48:34.546610 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.546579 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"]
Apr 24 21:48:34.684477 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.684456 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/1.log"
Apr 24 21:48:34.684585 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.684520 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:34.729512 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.729484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj_dc77c860-8b3d-4516-8561-7dd8ffe42bbe/storage-initializer/1.log"
Apr 24 21:48:34.729835 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.729531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj" event={"ID":"dc77c860-8b3d-4516-8561-7dd8ffe42bbe","Type":"ContainerDied","Data":"cc31f88f2dc41b5d2e19e5eb61c7e29eb1bbc8862bc6693bc3bf5ff63f695e46"}
Apr 24 21:48:34.729835 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.729572 2578 scope.go:117] "RemoveContainer" containerID="6696934852d7904452aff159dd4031fb0ccf43dc6794e95c73dc9b4417cd5d56"
Apr 24 21:48:34.729835 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.729615 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"
Apr 24 21:48:34.771862 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.771840 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kserve-provision-location\") pod \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") "
Apr 24 21:48:34.771951 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.771879 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls\") pod \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") "
Apr 24 21:48:34.771951 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.771902 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-cabundle-cert\") pod \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") "
Apr 24 21:48:34.771951 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.771922 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqln2\" (UniqueName: \"kubernetes.io/projected/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kube-api-access-wqln2\") pod \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") "
Apr 24 21:48:34.772067 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.771978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\") pod \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\" (UID: \"dc77c860-8b3d-4516-8561-7dd8ffe42bbe\") "
Apr 24 21:48:34.772172 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.772146 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dc77c860-8b3d-4516-8561-7dd8ffe42bbe" (UID: "dc77c860-8b3d-4516-8561-7dd8ffe42bbe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:48:34.772326 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.772302 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "dc77c860-8b3d-4516-8561-7dd8ffe42bbe" (UID: "dc77c860-8b3d-4516-8561-7dd8ffe42bbe"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:34.772415 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.772385 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config") pod "dc77c860-8b3d-4516-8561-7dd8ffe42bbe" (UID: "dc77c860-8b3d-4516-8561-7dd8ffe42bbe"). InnerVolumeSpecName "isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:34.773917 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.773885 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dc77c860-8b3d-4516-8561-7dd8ffe42bbe" (UID: "dc77c860-8b3d-4516-8561-7dd8ffe42bbe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:48:34.774048 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.774029 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kube-api-access-wqln2" (OuterVolumeSpecName: "kube-api-access-wqln2") pod "dc77c860-8b3d-4516-8561-7dd8ffe42bbe" (UID: "dc77c860-8b3d-4516-8561-7dd8ffe42bbe"). InnerVolumeSpecName "kube-api-access-wqln2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:48:34.873135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.873079 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-isvc-init-fail-30f3f9-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:34.873135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.873105 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:34.873135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.873119 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:34.873135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.873132 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-cabundle-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:48:34.873312 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:34.873144 2578 reconciler_common.go:299] "Volume detached for volume
\"kube-api-access-wqln2\" (UniqueName: \"kubernetes.io/projected/dc77c860-8b3d-4516-8561-7dd8ffe42bbe-kube-api-access-wqln2\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:48:35.067976 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:35.067953 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"] Apr 24 21:48:35.070944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:35.070923 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-30f3f9-predictor-8694459b44-4f5lj"] Apr 24 21:48:36.659431 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:48:36.659393 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" path="/var/lib/kubelet/pods/dc77c860-8b3d-4516-8561-7dd8ffe42bbe/volumes" Apr 24 21:50:24.887357 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887277 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"] Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887625 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887637 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887649 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887656 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" 
containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887667 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887673 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887692 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887698 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887704 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kube-rbac-proxy" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887709 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kube-rbac-proxy" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887716 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887720 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887788 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kserve-container" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887798 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887804 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc77c860-8b3d-4516-8561-7dd8ffe42bbe" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887810 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887817 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerName="storage-initializer" Apr 24 21:50:24.887816 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887825 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa9284c6-f35f-424e-b0ab-ece265617b53" containerName="kube-rbac-proxy" Apr 24 21:50:24.888332 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887881 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerName="storage-initializer" Apr 24 21:50:24.888332 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.887887 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ea2f55-4922-492d-a1dc-262944b17b3a" containerName="storage-initializer" Apr 24 21:50:24.891064 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.891045 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:24.893558 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.893531 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:50:24.893669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.893570 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 21:50:24.893669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.893586 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:50:24.893669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.893529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 24 21:50:24.893669 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.893535 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 21:50:24.900017 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.899996 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"] Apr 24 21:50:24.961726 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.961691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrx52\" (UniqueName: \"kubernetes.io/projected/d8361f7c-1569-46e0-b40b-a9087682cac2-kube-api-access-mrx52\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:24.961891 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:50:24.961739 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8361f7c-1569-46e0-b40b-a9087682cac2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:24.961891 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.961827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8361f7c-1569-46e0-b40b-a9087682cac2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:24.961999 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:24.961912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8361f7c-1569-46e0-b40b-a9087682cac2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.063303 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.063268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8361f7c-1569-46e0-b40b-a9087682cac2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.063480 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.063344 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8361f7c-1569-46e0-b40b-a9087682cac2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.063480 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.063396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrx52\" (UniqueName: \"kubernetes.io/projected/d8361f7c-1569-46e0-b40b-a9087682cac2-kube-api-access-mrx52\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.063480 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.063434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8361f7c-1569-46e0-b40b-a9087682cac2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.063855 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.063834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8361f7c-1569-46e0-b40b-a9087682cac2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.064051 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.064032 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8361f7c-1569-46e0-b40b-a9087682cac2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.065723 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.065705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8361f7c-1569-46e0-b40b-a9087682cac2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.071384 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.071366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrx52\" (UniqueName: \"kubernetes.io/projected/d8361f7c-1569-46e0-b40b-a9087682cac2-kube-api-access-mrx52\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.201818 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.201796 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:25.324423 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:25.324391 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"] Apr 24 21:50:25.327649 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:50:25.327612 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8361f7c_1569_46e0_b40b_a9087682cac2.slice/crio-2e23cdb5de451dac2de29751d10fc29a4682d404de6f00391524400d5654d228 WatchSource:0}: Error finding container 2e23cdb5de451dac2de29751d10fc29a4682d404de6f00391524400d5654d228: Status 404 returned error can't find the container with id 2e23cdb5de451dac2de29751d10fc29a4682d404de6f00391524400d5654d228 Apr 24 21:50:26.053824 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:26.053784 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerStarted","Data":"739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b"} Apr 24 21:50:26.053824 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:26.053823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerStarted","Data":"2e23cdb5de451dac2de29751d10fc29a4682d404de6f00391524400d5654d228"} Apr 24 21:50:30.069211 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:30.069178 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerID="739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b" exitCode=0 Apr 24 21:50:30.069574 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:30.069248 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerDied","Data":"739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b"} Apr 24 21:50:48.127708 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:48.127671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerStarted","Data":"8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba"} Apr 24 21:50:48.127708 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:48.127714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerStarted","Data":"34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9"} Apr 24 21:50:48.128285 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:48.127970 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:48.128285 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:48.127999 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:48.129417 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:48.129391 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:50:48.146025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:48.145979 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podStartSLOduration=6.2342353169999996 podStartE2EDuration="24.145967634s" podCreationTimestamp="2026-04-24 21:50:24 +0000 UTC" firstStartedPulling="2026-04-24 21:50:30.070499677 +0000 UTC m=+2054.038301718" lastFinishedPulling="2026-04-24 21:50:47.982231978 +0000 UTC m=+2071.950034035" observedRunningTime="2026-04-24 21:50:48.144613237 +0000 UTC m=+2072.112415299" watchObservedRunningTime="2026-04-24 21:50:48.145967634 +0000 UTC m=+2072.113769751" Apr 24 21:50:49.130410 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:49.130373 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:50:54.134825 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:54.134793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:50:54.135427 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:50:54.135398 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:04.135979 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:04.135938 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:14.135450 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:14.135408 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:16.678556 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:16.678518 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:51:16.683767 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:16.683731 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:51:24.136092 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:24.136051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:34.135979 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:34.135939 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:44.135894 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:44.135855 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:54.136075 ip-10-0-134-248 kubenswrapper[2578]: I0424 
21:51:54.136035 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:51:55.656418 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:51:55.656395 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:52:05.003011 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:05.002979 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"] Apr 24 21:52:05.005522 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:05.003313 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" containerID="cri-o://34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9" gracePeriod=30 Apr 24 21:52:05.005522 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:05.003329 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kube-rbac-proxy" containerID="cri-o://8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba" gracePeriod=30 Apr 24 21:52:05.352978 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:05.352892 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerID="8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba" exitCode=2 Apr 24 21:52:05.353106 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:05.352975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerDied","Data":"8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba"} Apr 24 21:52:05.656659 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:05.656619 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:52:08.935901 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.935880 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" Apr 24 21:52:08.986530 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.986472 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8361f7c-1569-46e0-b40b-a9087682cac2-kserve-provision-location\") pod \"d8361f7c-1569-46e0-b40b-a9087682cac2\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " Apr 24 21:52:08.986645 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.986539 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8361f7c-1569-46e0-b40b-a9087682cac2-proxy-tls\") pod \"d8361f7c-1569-46e0-b40b-a9087682cac2\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") " Apr 24 21:52:08.986645 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.986558 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8361f7c-1569-46e0-b40b-a9087682cac2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"d8361f7c-1569-46e0-b40b-a9087682cac2\" 
(UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") "
Apr 24 21:52:08.986645 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.986577 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrx52\" (UniqueName: \"kubernetes.io/projected/d8361f7c-1569-46e0-b40b-a9087682cac2-kube-api-access-mrx52\") pod \"d8361f7c-1569-46e0-b40b-a9087682cac2\" (UID: \"d8361f7c-1569-46e0-b40b-a9087682cac2\") "
Apr 24 21:52:08.986881 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.986856 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8361f7c-1569-46e0-b40b-a9087682cac2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d8361f7c-1569-46e0-b40b-a9087682cac2" (UID: "d8361f7c-1569-46e0-b40b-a9087682cac2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:08.986936 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.986882 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8361f7c-1569-46e0-b40b-a9087682cac2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "d8361f7c-1569-46e0-b40b-a9087682cac2" (UID: "d8361f7c-1569-46e0-b40b-a9087682cac2"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:52:08.988611 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.988592 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8361f7c-1569-46e0-b40b-a9087682cac2-kube-api-access-mrx52" (OuterVolumeSpecName: "kube-api-access-mrx52") pod "d8361f7c-1569-46e0-b40b-a9087682cac2" (UID: "d8361f7c-1569-46e0-b40b-a9087682cac2"). InnerVolumeSpecName "kube-api-access-mrx52". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:52:08.988690 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:08.988673 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8361f7c-1569-46e0-b40b-a9087682cac2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d8361f7c-1569-46e0-b40b-a9087682cac2" (UID: "d8361f7c-1569-46e0-b40b-a9087682cac2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:52:09.087691 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.087671 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8361f7c-1569-46e0-b40b-a9087682cac2-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:52:09.087691 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.087690 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8361f7c-1569-46e0-b40b-a9087682cac2-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:52:09.087837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.087700 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8361f7c-1569-46e0-b40b-a9087682cac2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:52:09.087837 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.087710 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrx52\" (UniqueName: \"kubernetes.io/projected/d8361f7c-1569-46e0-b40b-a9087682cac2-kube-api-access-mrx52\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:52:09.366997 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.366944 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerID="34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9" exitCode=0
Apr 24 21:52:09.367077 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.366998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerDied","Data":"34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9"}
Apr 24 21:52:09.367077 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.367017 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"
Apr 24 21:52:09.367077 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.367033 2578 scope.go:117] "RemoveContainer" containerID="8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba"
Apr 24 21:52:09.367190 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.367023 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf" event={"ID":"d8361f7c-1569-46e0-b40b-a9087682cac2","Type":"ContainerDied","Data":"2e23cdb5de451dac2de29751d10fc29a4682d404de6f00391524400d5654d228"}
Apr 24 21:52:09.377503 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.377484 2578 scope.go:117] "RemoveContainer" containerID="34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9"
Apr 24 21:52:09.384307 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.384290 2578 scope.go:117] "RemoveContainer" containerID="739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b"
Apr 24 21:52:09.388868 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.388846 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"]
Apr 24 21:52:09.391405 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.391388 2578 scope.go:117] "RemoveContainer" containerID="8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba"
Apr 24 21:52:09.391657 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:52:09.391628 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba\": container with ID starting with 8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba not found: ID does not exist" containerID="8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba"
Apr 24 21:52:09.391810 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.391666 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba"} err="failed to get container status \"8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba\": rpc error: code = NotFound desc = could not find container \"8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba\": container with ID starting with 8980646af449442d6d6513e2511ac39c40373d68b238cdb3f66861df59ddbaba not found: ID does not exist"
Apr 24 21:52:09.391810 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.391686 2578 scope.go:117] "RemoveContainer" containerID="34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9"
Apr 24 21:52:09.392080 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:52:09.392045 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9\": container with ID starting with 34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9 not found: ID does not exist" containerID="34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9"
Apr 24 21:52:09.392158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.392074 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9"} err="failed to get container status \"34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9\": rpc error: code = NotFound desc = could not find container \"34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9\": container with ID starting with 34e9064aa63d1cfdbc8ab790815bfad99b3256ca95d83f94c6887aabe16185d9 not found: ID does not exist"
Apr 24 21:52:09.392158 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.392098 2578 scope.go:117] "RemoveContainer" containerID="739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b"
Apr 24 21:52:09.392455 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:52:09.392433 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b\": container with ID starting with 739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b not found: ID does not exist" containerID="739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b"
Apr 24 21:52:09.392526 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.392464 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b"} err="failed to get container status \"739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b\": rpc error: code = NotFound desc = could not find container \"739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b\": container with ID starting with 739c17b210253286bee9b5facb16a859386a58060cc31b0a350d8400f68af02b not found: ID does not exist"
Apr 24 21:52:09.393937 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:09.393916 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fg9nf"]
Apr 24 21:52:10.659817 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:52:10.659787 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" path="/var/lib/kubelet/pods/d8361f7c-1569-46e0-b40b-a9087682cac2/volumes"
Apr 24 21:53:35.436091 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436006 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"]
Apr 24 21:53:35.436608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436478 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="storage-initializer"
Apr 24 21:53:35.436608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436498 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="storage-initializer"
Apr 24 21:53:35.436608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436528 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container"
Apr 24 21:53:35.436608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436537 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container"
Apr 24 21:53:35.436608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436577 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kube-rbac-proxy"
Apr 24 21:53:35.436608 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436586 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kube-rbac-proxy"
Apr 24 21:53:35.437014 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436657 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kube-rbac-proxy"
Apr 24 21:53:35.437014 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.436677 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8361f7c-1569-46e0-b40b-a9087682cac2" containerName="kserve-container"
Apr 24 21:53:35.439942 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.439922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.442266 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.442245 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\""
Apr 24 21:53:35.442376 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.442261 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\""
Apr 24 21:53:35.442376 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.442278 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:53:35.442376 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.442316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:53:35.442983 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.442964 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 24 21:53:35.450378 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.450359 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"]
Apr 24 21:53:35.498151 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.498125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a263606-6814-4b40-9cc5-497fd93550e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.498270 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.498161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.498319 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.498261 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a263606-6814-4b40-9cc5-497fd93550e6-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.498319 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.498294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs77\" (UniqueName: \"kubernetes.io/projected/5a263606-6814-4b40-9cc5-497fd93550e6-kube-api-access-cxs77\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.599203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.599184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a263606-6814-4b40-9cc5-497fd93550e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.599351 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.599216 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.599351 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:53:35.599332 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-serving-cert: secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found
Apr 24 21:53:35.599453 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.599355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a263606-6814-4b40-9cc5-497fd93550e6-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.599453 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.599386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs77\" (UniqueName: \"kubernetes.io/projected/5a263606-6814-4b40-9cc5-497fd93550e6-kube-api-access-cxs77\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.599453 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:53:35.599402 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls podName:5a263606-6814-4b40-9cc5-497fd93550e6 nodeName:}" failed. No retries permitted until 2026-04-24 21:53:36.099379665 +0000 UTC m=+2240.067181708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls") pod "isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" (UID: "5a263606-6814-4b40-9cc5-497fd93550e6") : secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found
Apr 24 21:53:35.599627 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.599585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a263606-6814-4b40-9cc5-497fd93550e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.600064 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.600040 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a263606-6814-4b40-9cc5-497fd93550e6-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:35.608526 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:35.608508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs77\" (UniqueName: \"kubernetes.io/projected/5a263606-6814-4b40-9cc5-497fd93550e6-kube-api-access-cxs77\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:36.104279 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.104244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:36.106960 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.106938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:36.351641 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.351607 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:36.473634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.473569 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"]
Apr 24 21:53:36.476013 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:53:36.475981 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a263606_6814_4b40_9cc5_497fd93550e6.slice/crio-8e7968fe1127fc3b4e70a80693da5f0d33f9a492c2b95bdbd4768d07655a99f0 WatchSource:0}: Error finding container 8e7968fe1127fc3b4e70a80693da5f0d33f9a492c2b95bdbd4768d07655a99f0: Status 404 returned error can't find the container with id 8e7968fe1127fc3b4e70a80693da5f0d33f9a492c2b95bdbd4768d07655a99f0
Apr 24 21:53:36.478306 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.478289 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:53:36.612902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.612799 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerStarted","Data":"000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68"}
Apr 24 21:53:36.612902 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:36.612846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerStarted","Data":"8e7968fe1127fc3b4e70a80693da5f0d33f9a492c2b95bdbd4768d07655a99f0"}
Apr 24 21:53:40.624100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:40.624012 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a263606-6814-4b40-9cc5-497fd93550e6" containerID="000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68" exitCode=0
Apr 24 21:53:40.624100 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:40.624060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerDied","Data":"000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68"}
Apr 24 21:53:41.628967 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:41.628931 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerStarted","Data":"0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc"}
Apr 24 21:53:41.628967 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:41.628972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerStarted","Data":"ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219"}
Apr 24 21:53:41.629393 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:41.629312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:41.629393 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:41.629348 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:53:41.648010 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:41.647957 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podStartSLOduration=6.647939324 podStartE2EDuration="6.647939324s" podCreationTimestamp="2026-04-24 21:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:53:41.645444537 +0000 UTC m=+2245.613246597" watchObservedRunningTime="2026-04-24 21:53:41.647939324 +0000 UTC m=+2245.615741388"
Apr 24 21:53:47.636599 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:53:47.636569 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:54:17.637974 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:17.637940 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 24 21:54:27.637716 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:27.637677 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 24 21:54:37.637947 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:37.637908 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 24 21:54:47.720253 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:47.720223 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:54:55.463025 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:55.462942 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"]
Apr 24 21:54:55.463584 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:55.463363 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" containerID="cri-o://ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219" gracePeriod=30
Apr 24 21:54:55.463584 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:55.463403 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kube-rbac-proxy" containerID="cri-o://0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc" gracePeriod=30
Apr 24 21:54:55.844206 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:55.844124 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a263606-6814-4b40-9cc5-497fd93550e6" containerID="0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc" exitCode=2
Apr 24 21:54:55.844344 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:55.844200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerDied","Data":"0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc"}
Apr 24 21:54:57.633072 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:57.633033 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.49:8643/healthz\": dial tcp 10.134.0.49:8643: connect: connection refused"
Apr 24 21:54:57.637435 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:57.637411 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 24 21:54:59.599788 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.599743 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:54:59.633769 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.633728 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls\") pod \"5a263606-6814-4b40-9cc5-497fd93550e6\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") "
Apr 24 21:54:59.633887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.633831 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a263606-6814-4b40-9cc5-497fd93550e6-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"5a263606-6814-4b40-9cc5-497fd93550e6\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") "
Apr 24 21:54:59.633887 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.633858 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a263606-6814-4b40-9cc5-497fd93550e6-kserve-provision-location\") pod \"5a263606-6814-4b40-9cc5-497fd93550e6\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") "
Apr 24 21:54:59.633958 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.633930 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxs77\" (UniqueName: \"kubernetes.io/projected/5a263606-6814-4b40-9cc5-497fd93550e6-kube-api-access-cxs77\") pod \"5a263606-6814-4b40-9cc5-497fd93550e6\" (UID: \"5a263606-6814-4b40-9cc5-497fd93550e6\") "
Apr 24 21:54:59.634203 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.634175 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a263606-6814-4b40-9cc5-497fd93550e6-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "5a263606-6814-4b40-9cc5-497fd93550e6" (UID: "5a263606-6814-4b40-9cc5-497fd93550e6"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:54:59.634311 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.634199 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a263606-6814-4b40-9cc5-497fd93550e6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5a263606-6814-4b40-9cc5-497fd93550e6" (UID: "5a263606-6814-4b40-9cc5-497fd93550e6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:54:59.635974 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.635940 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5a263606-6814-4b40-9cc5-497fd93550e6" (UID: "5a263606-6814-4b40-9cc5-497fd93550e6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:54:59.636074 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.636022 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a263606-6814-4b40-9cc5-497fd93550e6-kube-api-access-cxs77" (OuterVolumeSpecName: "kube-api-access-cxs77") pod "5a263606-6814-4b40-9cc5-497fd93550e6" (UID: "5a263606-6814-4b40-9cc5-497fd93550e6"). InnerVolumeSpecName "kube-api-access-cxs77". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:54:59.735049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.734988 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxs77\" (UniqueName: \"kubernetes.io/projected/5a263606-6814-4b40-9cc5-497fd93550e6-kube-api-access-cxs77\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:54:59.735049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.735012 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a263606-6814-4b40-9cc5-497fd93550e6-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:54:59.735049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.735026 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a263606-6814-4b40-9cc5-497fd93550e6-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:54:59.735049 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.735042 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a263606-6814-4b40-9cc5-497fd93550e6-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 21:54:59.857086 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.857056 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a263606-6814-4b40-9cc5-497fd93550e6" containerID="ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219" exitCode=0
Apr 24 21:54:59.857173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.857097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerDied","Data":"ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219"}
Apr 24 21:54:59.857173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.857119 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd" event={"ID":"5a263606-6814-4b40-9cc5-497fd93550e6","Type":"ContainerDied","Data":"8e7968fe1127fc3b4e70a80693da5f0d33f9a492c2b95bdbd4768d07655a99f0"}
Apr 24 21:54:59.857173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.857133 2578 scope.go:117] "RemoveContainer" containerID="0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc"
Apr 24 21:54:59.857173 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.857147 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"
Apr 24 21:54:59.865413 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.865394 2578 scope.go:117] "RemoveContainer" containerID="ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219"
Apr 24 21:54:59.872724 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.872710 2578 scope.go:117] "RemoveContainer" containerID="000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68"
Apr 24 21:54:59.877348 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.877329 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"]
Apr 24 21:54:59.879625 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.879610 2578 scope.go:117] "RemoveContainer" containerID="0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc"
Apr 24 21:54:59.880003 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:54:59.879930 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc\": container with ID starting with 0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc not found: ID does not exist" containerID="0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc"
Apr 24 21:54:59.880003 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.879969 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc"} err="failed to get container status \"0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc\": rpc error: code = NotFound desc = could not find container \"0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc\": container with ID starting with 0a81cb51ad4adc4ee4cfd3554f2397aa465ea468ad6e783fc419ed12c59c45bc not found: ID does not exist"
Apr 24 21:54:59.880003 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.879993 2578 scope.go:117] "RemoveContainer" containerID="ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219"
Apr 24 21:54:59.880302 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:54:59.880280 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219\": container with ID starting with ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219 not found: ID does not exist" containerID="ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219"
Apr 24 21:54:59.880497 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.880381 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219"} err="failed to get container status \"ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219\": rpc error: code = NotFound desc = could not find container \"ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219\": container with ID starting with ec421a4ec7972f4aa2decd80d631ad5c37e1c1a3970985915c4c1898e6bd5219 not found: ID does not exist"
Apr 24 21:54:59.880497 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.880409 2578 scope.go:117] "RemoveContainer" containerID="000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68"
Apr 24 21:54:59.880690 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:54:59.880667 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68\": container with ID starting with 000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68 not found: ID does not exist" containerID="000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68"
Apr 24
21:54:59.880779 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.880708 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68"} err="failed to get container status \"000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68\": rpc error: code = NotFound desc = could not find container \"000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68\": container with ID starting with 000653ddcb5906b41061bf3288df2ef0865860fcb205655d30b7b944bf94bd68 not found: ID does not exist" Apr 24 21:54:59.881602 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:54:59.881585 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-bh7xd"] Apr 24 21:55:00.660048 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:55:00.660014 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" path="/var/lib/kubelet/pods/5a263606-6814-4b40-9cc5-497fd93550e6/volumes" Apr 24 21:56:16.705246 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:56:16.705215 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:56:16.710283 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:56:16.710255 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 21:57:58.020628 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.020509 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh"] Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021033 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" 
containerName="kube-rbac-proxy" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021052 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kube-rbac-proxy" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021077 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021087 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021098 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="storage-initializer" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021108 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="storage-initializer" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021179 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kserve-container" Apr 24 21:57:58.023239 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.021192 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a263606-6814-4b40-9cc5-497fd93550e6" containerName="kube-rbac-proxy" Apr 24 21:57:58.024186 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.024166 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.026568 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.026546 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 21:57:58.026568 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.026563 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:57:58.027314 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.027293 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 21:57:58.027371 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.027311 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 24 21:57:58.027371 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.027330 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:57:58.035571 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.035549 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh"] Apr 24 21:57:58.090165 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.090135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c69d628e-481d-4007-88ac-7fd4908d1e7b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.090298 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.090185 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbc9\" (UniqueName: \"kubernetes.io/projected/c69d628e-481d-4007-88ac-7fd4908d1e7b-kube-api-access-6vbc9\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.090298 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.090279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c69d628e-481d-4007-88ac-7fd4908d1e7b-proxy-tls\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.090374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.090314 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69d628e-481d-4007-88ac-7fd4908d1e7b-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.191169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.191141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c69d628e-481d-4007-88ac-7fd4908d1e7b-proxy-tls\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.191280 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.191175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c69d628e-481d-4007-88ac-7fd4908d1e7b-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.191280 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.191212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c69d628e-481d-4007-88ac-7fd4908d1e7b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.191280 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.191247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbc9\" (UniqueName: \"kubernetes.io/projected/c69d628e-481d-4007-88ac-7fd4908d1e7b-kube-api-access-6vbc9\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.191557 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.191538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69d628e-481d-4007-88ac-7fd4908d1e7b-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.191886 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.191870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c69d628e-481d-4007-88ac-7fd4908d1e7b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.193586 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.193558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c69d628e-481d-4007-88ac-7fd4908d1e7b-proxy-tls\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.198847 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.198828 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbc9\" (UniqueName: \"kubernetes.io/projected/c69d628e-481d-4007-88ac-7fd4908d1e7b-kube-api-access-6vbc9\") pod \"isvc-sklearn-predictor-7b58d995d4-cs6zh\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.335475 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.335419 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:57:58.452605 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:58.452583 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh"] Apr 24 21:57:58.455489 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:57:58.455458 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69d628e_481d_4007_88ac_7fd4908d1e7b.slice/crio-bb0ed6b4dfc59f82a01a81de2dd128de0bff2f5bb6af4c9becc5d7c8f6642cc7 WatchSource:0}: Error finding container bb0ed6b4dfc59f82a01a81de2dd128de0bff2f5bb6af4c9becc5d7c8f6642cc7: Status 404 returned error can't find the container with id bb0ed6b4dfc59f82a01a81de2dd128de0bff2f5bb6af4c9becc5d7c8f6642cc7 Apr 24 21:57:59.367981 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:59.367945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerStarted","Data":"ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780"} Apr 24 21:57:59.368333 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:57:59.367989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerStarted","Data":"bb0ed6b4dfc59f82a01a81de2dd128de0bff2f5bb6af4c9becc5d7c8f6642cc7"} Apr 24 21:58:03.380388 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:03.380358 2578 generic.go:358] "Generic (PLEG): container finished" podID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerID="ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780" exitCode=0 Apr 24 21:58:03.380773 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:03.380394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerDied","Data":"ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780"} Apr 24 21:58:04.385473 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:04.385439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerStarted","Data":"e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32"} Apr 24 21:58:04.385873 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:04.385482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerStarted","Data":"14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97"} Apr 24 21:58:04.385873 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:04.385697 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:58:04.406634 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:04.406593 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podStartSLOduration=6.406579989 podStartE2EDuration="6.406579989s" podCreationTimestamp="2026-04-24 21:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:04.404262384 +0000 UTC m=+2508.372064446" watchObservedRunningTime="2026-04-24 21:58:04.406579989 +0000 UTC m=+2508.374382050" Apr 24 21:58:05.388312 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:05.388283 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 
21:58:05.389421 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:05.389395 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:06.390570 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:06.390529 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:11.394242 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:11.394205 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:58:11.394865 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:11.394838 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:21.394708 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:21.394671 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:31.395552 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:31.395513 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:41.394930 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:41.394894 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:51.394736 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:58:51.394693 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:59:01.395713 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:01.395685 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:59:08.120281 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.120249 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh"] Apr 24 21:59:08.120890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.120564 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" containerID="cri-o://14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97" gracePeriod=30 Apr 24 21:59:08.120890 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.120628 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kube-rbac-proxy" 
containerID="cri-o://e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32" gracePeriod=30 Apr 24 21:59:08.196386 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.196349 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq"] Apr 24 21:59:08.200179 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.200159 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.202461 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.202443 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 24 21:59:08.202569 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.202471 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 21:59:08.210092 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.210069 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq"] Apr 24 21:59:08.301501 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.301476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9522c87-c436-4251-af15-cfbc05d623bf-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.301595 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.301513 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f9522c87-c436-4251-af15-cfbc05d623bf-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.301640 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.301611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqftx\" (UniqueName: \"kubernetes.io/projected/f9522c87-c436-4251-af15-cfbc05d623bf-kube-api-access-gqftx\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.301704 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.301687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.402671 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.402645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqftx\" (UniqueName: \"kubernetes.io/projected/f9522c87-c436-4251-af15-cfbc05d623bf-kube-api-access-gqftx\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.402790 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.402688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" 
(UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.402790 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.402719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9522c87-c436-4251-af15-cfbc05d623bf-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.402790 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.402741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9522c87-c436-4251-af15-cfbc05d623bf-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.402923 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:59:08.402862 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-serving-cert: secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 24 21:59:08.402974 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:59:08.402930 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls podName:f9522c87-c436-4251-af15-cfbc05d623bf nodeName:}" failed. No retries permitted until 2026-04-24 21:59:08.902913908 +0000 UTC m=+2572.870715949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls") pod "sklearn-v2-mlserver-predictor-65d8664766-prnmq" (UID: "f9522c87-c436-4251-af15-cfbc05d623bf") : secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 24 21:59:08.403065 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.403049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9522c87-c436-4251-af15-cfbc05d623bf-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.403340 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.403323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9522c87-c436-4251-af15-cfbc05d623bf-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.411538 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.411511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqftx\" (UniqueName: \"kubernetes.io/projected/f9522c87-c436-4251-af15-cfbc05d623bf-kube-api-access-gqftx\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.566422 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.566394 2578 generic.go:358] "Generic (PLEG): container finished" podID="c69d628e-481d-4007-88ac-7fd4908d1e7b" 
containerID="e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32" exitCode=2 Apr 24 21:59:08.566531 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.566465 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerDied","Data":"e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32"} Apr 24 21:59:08.906830 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.906804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:08.909126 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:08.909101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-prnmq\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:09.110719 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:09.110684 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:09.231260 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:09.231191 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq"] Apr 24 21:59:09.233988 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:59:09.233948 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9522c87_c436_4251_af15_cfbc05d623bf.slice/crio-3ff0979250677bea3c391e5201858083d9fd044787ff238e34730302b31bcdfc WatchSource:0}: Error finding container 3ff0979250677bea3c391e5201858083d9fd044787ff238e34730302b31bcdfc: Status 404 returned error can't find the container with id 3ff0979250677bea3c391e5201858083d9fd044787ff238e34730302b31bcdfc Apr 24 21:59:09.235974 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:09.235956 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:59:09.571002 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:09.570908 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerStarted","Data":"5afadf0837ab6f5b907d1bc6d882dff6a7f5bd81b92b61b8929bcc4578ef99eb"} Apr 24 21:59:09.571002 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:09.570958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerStarted","Data":"3ff0979250677bea3c391e5201858083d9fd044787ff238e34730302b31bcdfc"} Apr 24 21:59:11.391374 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.391337 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" 
podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 21:59:11.395665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.395641 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:59:11.772533 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.772513 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:59:11.827378 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.827357 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbc9\" (UniqueName: \"kubernetes.io/projected/c69d628e-481d-4007-88ac-7fd4908d1e7b-kube-api-access-6vbc9\") pod \"c69d628e-481d-4007-88ac-7fd4908d1e7b\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " Apr 24 21:59:11.827464 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.827392 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c69d628e-481d-4007-88ac-7fd4908d1e7b-proxy-tls\") pod \"c69d628e-481d-4007-88ac-7fd4908d1e7b\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " Apr 24 21:59:11.827464 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.827446 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c69d628e-481d-4007-88ac-7fd4908d1e7b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"c69d628e-481d-4007-88ac-7fd4908d1e7b\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " Apr 24 21:59:11.827546 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.827469 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69d628e-481d-4007-88ac-7fd4908d1e7b-kserve-provision-location\") pod \"c69d628e-481d-4007-88ac-7fd4908d1e7b\" (UID: \"c69d628e-481d-4007-88ac-7fd4908d1e7b\") " Apr 24 21:59:11.827834 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.827803 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69d628e-481d-4007-88ac-7fd4908d1e7b-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "c69d628e-481d-4007-88ac-7fd4908d1e7b" (UID: "c69d628e-481d-4007-88ac-7fd4908d1e7b"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:11.827834 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.827809 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69d628e-481d-4007-88ac-7fd4908d1e7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c69d628e-481d-4007-88ac-7fd4908d1e7b" (UID: "c69d628e-481d-4007-88ac-7fd4908d1e7b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:59:11.829549 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.829526 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69d628e-481d-4007-88ac-7fd4908d1e7b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c69d628e-481d-4007-88ac-7fd4908d1e7b" (UID: "c69d628e-481d-4007-88ac-7fd4908d1e7b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:59:11.829638 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.829574 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69d628e-481d-4007-88ac-7fd4908d1e7b-kube-api-access-6vbc9" (OuterVolumeSpecName: "kube-api-access-6vbc9") pod "c69d628e-481d-4007-88ac-7fd4908d1e7b" (UID: "c69d628e-481d-4007-88ac-7fd4908d1e7b"). InnerVolumeSpecName "kube-api-access-6vbc9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:59:11.928486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.928464 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c69d628e-481d-4007-88ac-7fd4908d1e7b-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:59:11.928486 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.928485 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c69d628e-481d-4007-88ac-7fd4908d1e7b-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:59:11.928615 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.928494 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vbc9\" (UniqueName: \"kubernetes.io/projected/c69d628e-481d-4007-88ac-7fd4908d1e7b-kube-api-access-6vbc9\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:59:11.928615 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:11.928506 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c69d628e-481d-4007-88ac-7fd4908d1e7b-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 21:59:12.583813 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.583777 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerID="14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97" exitCode=0 Apr 24 21:59:12.584202 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.583833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerDied","Data":"14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97"} Apr 24 21:59:12.584202 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.583867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" event={"ID":"c69d628e-481d-4007-88ac-7fd4908d1e7b","Type":"ContainerDied","Data":"bb0ed6b4dfc59f82a01a81de2dd128de0bff2f5bb6af4c9becc5d7c8f6642cc7"} Apr 24 21:59:12.584202 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.583884 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh" Apr 24 21:59:12.584202 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.583888 2578 scope.go:117] "RemoveContainer" containerID="e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32" Apr 24 21:59:12.592834 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.592808 2578 scope.go:117] "RemoveContainer" containerID="14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97" Apr 24 21:59:12.599639 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.599621 2578 scope.go:117] "RemoveContainer" containerID="ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780" Apr 24 21:59:12.606803 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.606781 2578 scope.go:117] "RemoveContainer" containerID="e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32" Apr 24 21:59:12.607071 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:59:12.607048 2578 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32\": container with ID starting with e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32 not found: ID does not exist" containerID="e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32" Apr 24 21:59:12.607135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.607081 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32"} err="failed to get container status \"e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32\": rpc error: code = NotFound desc = could not find container \"e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32\": container with ID starting with e2d5855eddfd4e334c0f20140f4667c488446d81d98b30882599be21e9dc1a32 not found: ID does not exist" Apr 24 21:59:12.607135 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.607105 2578 scope.go:117] "RemoveContainer" containerID="14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97" Apr 24 21:59:12.607428 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:59:12.607394 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97\": container with ID starting with 14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97 not found: ID does not exist" containerID="14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97" Apr 24 21:59:12.607489 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.607437 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97"} err="failed to get container status \"14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97\": 
rpc error: code = NotFound desc = could not find container \"14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97\": container with ID starting with 14c05939f6f32c09a75d978c3d1cd2cda62dcd897860eb21ff4e7791ecadfe97 not found: ID does not exist" Apr 24 21:59:12.607489 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.607459 2578 scope.go:117] "RemoveContainer" containerID="ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780" Apr 24 21:59:12.607704 ip-10-0-134-248 kubenswrapper[2578]: E0424 21:59:12.607684 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780\": container with ID starting with ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780 not found: ID does not exist" containerID="ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780" Apr 24 21:59:12.607765 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.607709 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780"} err="failed to get container status \"ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780\": rpc error: code = NotFound desc = could not find container \"ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780\": container with ID starting with ee0ad6b3269c4e3c0c9f3d24ec71067716e761cd1355ae04af78b68ae4a62780 not found: ID does not exist" Apr 24 21:59:12.607944 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.607926 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh"] Apr 24 21:59:12.611883 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:12.611864 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b58d995d4-cs6zh"] Apr 24 21:59:12.659721 ip-10-0-134-248 
kubenswrapper[2578]: I0424 21:59:12.659700 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" path="/var/lib/kubelet/pods/c69d628e-481d-4007-88ac-7fd4908d1e7b/volumes" Apr 24 21:59:13.588291 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:13.588261 2578 generic.go:358] "Generic (PLEG): container finished" podID="f9522c87-c436-4251-af15-cfbc05d623bf" containerID="5afadf0837ab6f5b907d1bc6d882dff6a7f5bd81b92b61b8929bcc4578ef99eb" exitCode=0 Apr 24 21:59:13.588671 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:13.588337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerDied","Data":"5afadf0837ab6f5b907d1bc6d882dff6a7f5bd81b92b61b8929bcc4578ef99eb"} Apr 24 21:59:14.594850 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:14.594813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerStarted","Data":"f37f5e470dd62f48f99a0de646c7185b91e39690d794f84f18dd513d158ced94"} Apr 24 21:59:14.594850 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:14.594855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerStarted","Data":"5a49e84502fa1aebf84ceddceeee5b20fb7334fd73d14e2dfcd9cb505e5122ec"} Apr 24 21:59:14.595347 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:14.595087 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:14.616858 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:14.616810 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" podStartSLOduration=6.616798484 podStartE2EDuration="6.616798484s" podCreationTimestamp="2026-04-24 21:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:59:14.614580165 +0000 UTC m=+2578.582382226" watchObservedRunningTime="2026-04-24 21:59:14.616798484 +0000 UTC m=+2578.584600545" Apr 24 21:59:15.598281 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:15.598256 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:21.606878 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:21.606848 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:51.610489 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:51.610459 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 21:59:58.308506 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.308468 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq"] Apr 24 21:59:58.308920 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.308887 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kube-rbac-proxy" containerID="cri-o://f37f5e470dd62f48f99a0de646c7185b91e39690d794f84f18dd513d158ced94" gracePeriod=30 Apr 24 21:59:58.309028 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.308867 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" 
podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kserve-container" containerID="cri-o://5a49e84502fa1aebf84ceddceeee5b20fb7334fd73d14e2dfcd9cb505e5122ec" gracePeriod=30 Apr 24 21:59:58.424054 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424024 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq"] Apr 24 21:59:58.424430 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424412 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kube-rbac-proxy" Apr 24 21:59:58.424516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424433 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kube-rbac-proxy" Apr 24 21:59:58.424516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424446 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" Apr 24 21:59:58.424516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424454 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" Apr 24 21:59:58.424516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424486 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="storage-initializer" Apr 24 21:59:58.424516 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424495 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="storage-initializer" Apr 24 21:59:58.424788 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424579 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kserve-container" Apr 24 21:59:58.424788 
ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.424597 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c69d628e-481d-4007-88ac-7fd4908d1e7b" containerName="kube-rbac-proxy" Apr 24 21:59:58.431352 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.431319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.435344 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.435324 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 24 21:59:58.435452 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.435331 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:59:58.444550 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.444526 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq"] Apr 24 21:59:58.565509 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.565431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/21c615f8-7088-4a30-b75d-3052e78aa031-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.565509 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.565469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq5d8\" (UniqueName: \"kubernetes.io/projected/21c615f8-7088-4a30-b75d-3052e78aa031-kube-api-access-mq5d8\") pod 
\"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.565686 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.565515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21c615f8-7088-4a30-b75d-3052e78aa031-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.565686 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.565547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c615f8-7088-4a30-b75d-3052e78aa031-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.666046 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.666019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c615f8-7088-4a30-b75d-3052e78aa031-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.666169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.666084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/21c615f8-7088-4a30-b75d-3052e78aa031-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.666169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.666109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq5d8\" (UniqueName: \"kubernetes.io/projected/21c615f8-7088-4a30-b75d-3052e78aa031-kube-api-access-mq5d8\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.666169 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.666154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21c615f8-7088-4a30-b75d-3052e78aa031-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.666489 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.666466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c615f8-7088-4a30-b75d-3052e78aa031-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.666741 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.666723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/21c615f8-7088-4a30-b75d-3052e78aa031-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.668665 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.668643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21c615f8-7088-4a30-b75d-3052e78aa031-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.674040 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.674022 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq5d8\" (UniqueName: \"kubernetes.io/projected/21c615f8-7088-4a30-b75d-3052e78aa031-kube-api-access-mq5d8\") pod \"isvc-sklearn-runtime-predictor-5747956474-9w8pq\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.719937 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.719912 2578 generic.go:358] "Generic (PLEG): container finished" podID="f9522c87-c436-4251-af15-cfbc05d623bf" containerID="f37f5e470dd62f48f99a0de646c7185b91e39690d794f84f18dd513d158ced94" exitCode=2 Apr 24 21:59:58.720046 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.719956 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerDied","Data":"f37f5e470dd62f48f99a0de646c7185b91e39690d794f84f18dd513d158ced94"} Apr 24 21:59:58.741064 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.741038 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 21:59:58.861285 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:58.861257 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq"] Apr 24 21:59:58.863722 ip-10-0-134-248 kubenswrapper[2578]: W0424 21:59:58.863695 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c615f8_7088_4a30_b75d_3052e78aa031.slice/crio-1e62003ddbe13beb0ba4fba399fae60fc18c0bf54c80ff11bc3bb4f80792a3ba WatchSource:0}: Error finding container 1e62003ddbe13beb0ba4fba399fae60fc18c0bf54c80ff11bc3bb4f80792a3ba: Status 404 returned error can't find the container with id 1e62003ddbe13beb0ba4fba399fae60fc18c0bf54c80ff11bc3bb4f80792a3ba Apr 24 21:59:59.724162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:59.724124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerStarted","Data":"a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5"} Apr 24 21:59:59.724162 ip-10-0-134-248 kubenswrapper[2578]: I0424 21:59:59.724164 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerStarted","Data":"1e62003ddbe13beb0ba4fba399fae60fc18c0bf54c80ff11bc3bb4f80792a3ba"} Apr 24 22:00:01.602149 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:01.602114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection 
refused" Apr 24 22:00:01.608287 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:01.608264 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.51:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 22:00:04.739907 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:04.739876 2578 generic.go:358] "Generic (PLEG): container finished" podID="f9522c87-c436-4251-af15-cfbc05d623bf" containerID="5a49e84502fa1aebf84ceddceeee5b20fb7334fd73d14e2dfcd9cb505e5122ec" exitCode=0 Apr 24 22:00:04.740231 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:04.739933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerDied","Data":"5a49e84502fa1aebf84ceddceeee5b20fb7334fd73d14e2dfcd9cb505e5122ec"} Apr 24 22:00:04.741262 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:04.741240 2578 generic.go:358] "Generic (PLEG): container finished" podID="21c615f8-7088-4a30-b75d-3052e78aa031" containerID="a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5" exitCode=0 Apr 24 22:00:04.741347 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:04.741309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerDied","Data":"a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5"} Apr 24 22:00:04.870772 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:04.870728 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 22:00:05.019646 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.019610 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9522c87-c436-4251-af15-cfbc05d623bf-kserve-provision-location\") pod \"f9522c87-c436-4251-af15-cfbc05d623bf\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " Apr 24 22:00:05.019856 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.019677 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqftx\" (UniqueName: \"kubernetes.io/projected/f9522c87-c436-4251-af15-cfbc05d623bf-kube-api-access-gqftx\") pod \"f9522c87-c436-4251-af15-cfbc05d623bf\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " Apr 24 22:00:05.019856 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.019734 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9522c87-c436-4251-af15-cfbc05d623bf-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"f9522c87-c436-4251-af15-cfbc05d623bf\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " Apr 24 22:00:05.019856 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.019777 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls\") pod \"f9522c87-c436-4251-af15-cfbc05d623bf\" (UID: \"f9522c87-c436-4251-af15-cfbc05d623bf\") " Apr 24 22:00:05.020032 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.019962 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9522c87-c436-4251-af15-cfbc05d623bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"f9522c87-c436-4251-af15-cfbc05d623bf" (UID: "f9522c87-c436-4251-af15-cfbc05d623bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:05.020291 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.020095 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9522c87-c436-4251-af15-cfbc05d623bf-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "f9522c87-c436-4251-af15-cfbc05d623bf" (UID: "f9522c87-c436-4251-af15-cfbc05d623bf"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:00:05.022172 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.022145 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9522c87-c436-4251-af15-cfbc05d623bf-kube-api-access-gqftx" (OuterVolumeSpecName: "kube-api-access-gqftx") pod "f9522c87-c436-4251-af15-cfbc05d623bf" (UID: "f9522c87-c436-4251-af15-cfbc05d623bf"). InnerVolumeSpecName "kube-api-access-gqftx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:05.022395 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.022371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f9522c87-c436-4251-af15-cfbc05d623bf" (UID: "f9522c87-c436-4251-af15-cfbc05d623bf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:05.121336 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.121302 2578 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9522c87-c436-4251-af15-cfbc05d623bf-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:05.121433 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.121339 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9522c87-c436-4251-af15-cfbc05d623bf-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:05.121433 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.121354 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9522c87-c436-4251-af15-cfbc05d623bf-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:05.121433 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.121369 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqftx\" (UniqueName: \"kubernetes.io/projected/f9522c87-c436-4251-af15-cfbc05d623bf-kube-api-access-gqftx\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:05.745892 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.745853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" event={"ID":"f9522c87-c436-4251-af15-cfbc05d623bf","Type":"ContainerDied","Data":"3ff0979250677bea3c391e5201858083d9fd044787ff238e34730302b31bcdfc"} Apr 24 22:00:05.745892 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.745891 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq" Apr 24 22:00:05.746381 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.745900 2578 scope.go:117] "RemoveContainer" containerID="f37f5e470dd62f48f99a0de646c7185b91e39690d794f84f18dd513d158ced94" Apr 24 22:00:05.748211 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.748186 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerStarted","Data":"aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707"} Apr 24 22:00:05.748326 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.748225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerStarted","Data":"407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83"} Apr 24 22:00:05.748481 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.748460 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 22:00:05.748527 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.748493 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 22:00:05.749872 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.749834 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 22:00:05.755572 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.755555 2578 scope.go:117] "RemoveContainer" 
containerID="5a49e84502fa1aebf84ceddceeee5b20fb7334fd73d14e2dfcd9cb505e5122ec" Apr 24 22:00:05.763278 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.763261 2578 scope.go:117] "RemoveContainer" containerID="5afadf0837ab6f5b907d1bc6d882dff6a7f5bd81b92b61b8929bcc4578ef99eb" Apr 24 22:00:05.783345 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.783310 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" podStartSLOduration=7.783298402 podStartE2EDuration="7.783298402s" podCreationTimestamp="2026-04-24 21:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:00:05.780622893 +0000 UTC m=+2629.748424955" watchObservedRunningTime="2026-04-24 22:00:05.783298402 +0000 UTC m=+2629.751100648" Apr 24 22:00:05.796209 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.796187 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq"] Apr 24 22:00:05.799631 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:05.799598 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-prnmq"] Apr 24 22:00:06.662679 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:06.662638 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" path="/var/lib/kubelet/pods/f9522c87-c436-4251-af15-cfbc05d623bf/volumes" Apr 24 22:00:06.752301 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:06.752262 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 22:00:11.756622 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:00:11.756548 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 22:00:11.757160 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:11.757131 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 22:00:21.757868 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:21.757837 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 22:00:35.308254 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.308223 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-5747956474-9w8pq_21c615f8-7088-4a30-b75d-3052e78aa031/kserve-container/0.log" Apr 24 22:00:35.440262 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.440233 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq"] Apr 24 22:00:35.440554 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.440526 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container" containerID="cri-o://407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83" gracePeriod=30 Apr 24 22:00:35.440627 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.440563 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kube-rbac-proxy" 
containerID="cri-o://aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707" gracePeriod=30 Apr 24 22:00:35.566311 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566242 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"] Apr 24 22:00:35.566603 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566589 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="storage-initializer" Apr 24 22:00:35.566656 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566605 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="storage-initializer" Apr 24 22:00:35.566656 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566617 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kserve-container" Apr 24 22:00:35.566656 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566623 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kserve-container" Apr 24 22:00:35.566656 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566636 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kube-rbac-proxy" Apr 24 22:00:35.566656 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566642 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kube-rbac-proxy" Apr 24 22:00:35.566850 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566700 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kube-rbac-proxy" Apr 24 22:00:35.566850 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.566708 2578 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f9522c87-c436-4251-af15-cfbc05d623bf" containerName="kserve-container" Apr 24 22:00:35.569877 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.569861 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.572566 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.572546 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:00:35.572673 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.572568 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 24 22:00:35.588800 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.588730 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"] Apr 24 22:00:35.646287 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.646255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d606011-6522-4a74-a075-07fd69dccd79-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.646443 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.646298 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d606011-6522-4a74-a075-07fd69dccd79-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: 
\"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.646443 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.646324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606011-6522-4a74-a075-07fd69dccd79-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.646443 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.646348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rcr\" (UniqueName: \"kubernetes.io/projected/1d606011-6522-4a74-a075-07fd69dccd79-kube-api-access-69rcr\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.747486 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.747452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d606011-6522-4a74-a075-07fd69dccd79-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.747486 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.747500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d606011-6522-4a74-a075-07fd69dccd79-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: 
\"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.747733 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.747522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606011-6522-4a74-a075-07fd69dccd79-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.747733 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.747537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69rcr\" (UniqueName: \"kubernetes.io/projected/1d606011-6522-4a74-a075-07fd69dccd79-kube-api-access-69rcr\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.747982 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.747960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d606011-6522-4a74-a075-07fd69dccd79-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.748359 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.748337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d606011-6522-4a74-a075-07fd69dccd79-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.750096 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.750079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606011-6522-4a74-a075-07fd69dccd79-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.756363 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.756337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rcr\" (UniqueName: \"kubernetes.io/projected/1d606011-6522-4a74-a075-07fd69dccd79-kube-api-access-69rcr\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:35.848538 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.848453 2578 generic.go:358] "Generic (PLEG): container finished" podID="21c615f8-7088-4a30-b75d-3052e78aa031" containerID="aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707" exitCode=2 Apr 24 22:00:35.848673 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.848534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerDied","Data":"aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707"} Apr 24 22:00:35.879918 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:35.879882 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" Apr 24 22:00:36.095440 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.095412 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"] Apr 24 22:00:36.105735 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:00:36.105679 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d606011_6522_4a74_a075_07fd69dccd79.slice/crio-63ca53a08b05345303119c721c5afb079c9d1e62df973df091899890070869a4 WatchSource:0}: Error finding container 63ca53a08b05345303119c721c5afb079c9d1e62df973df091899890070869a4: Status 404 returned error can't find the container with id 63ca53a08b05345303119c721c5afb079c9d1e62df973df091899890070869a4 Apr 24 22:00:36.270310 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.270287 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 22:00:36.352035 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.352002 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/21c615f8-7088-4a30-b75d-3052e78aa031-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"21c615f8-7088-4a30-b75d-3052e78aa031\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " Apr 24 22:00:36.352035 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.352040 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq5d8\" (UniqueName: \"kubernetes.io/projected/21c615f8-7088-4a30-b75d-3052e78aa031-kube-api-access-mq5d8\") pod \"21c615f8-7088-4a30-b75d-3052e78aa031\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " Apr 24 22:00:36.352498 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.352070 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21c615f8-7088-4a30-b75d-3052e78aa031-proxy-tls\") pod \"21c615f8-7088-4a30-b75d-3052e78aa031\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " Apr 24 22:00:36.352498 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.352095 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c615f8-7088-4a30-b75d-3052e78aa031-kserve-provision-location\") pod \"21c615f8-7088-4a30-b75d-3052e78aa031\" (UID: \"21c615f8-7088-4a30-b75d-3052e78aa031\") " Apr 24 22:00:36.352498 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.352396 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c615f8-7088-4a30-b75d-3052e78aa031-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "21c615f8-7088-4a30-b75d-3052e78aa031" (UID: "21c615f8-7088-4a30-b75d-3052e78aa031"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:00:36.354701 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.354668 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c615f8-7088-4a30-b75d-3052e78aa031-kube-api-access-mq5d8" (OuterVolumeSpecName: "kube-api-access-mq5d8") pod "21c615f8-7088-4a30-b75d-3052e78aa031" (UID: "21c615f8-7088-4a30-b75d-3052e78aa031"). InnerVolumeSpecName "kube-api-access-mq5d8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:36.354701 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.354669 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c615f8-7088-4a30-b75d-3052e78aa031-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "21c615f8-7088-4a30-b75d-3052e78aa031" (UID: "21c615f8-7088-4a30-b75d-3052e78aa031"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:36.356590 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.356550 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c615f8-7088-4a30-b75d-3052e78aa031-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21c615f8-7088-4a30-b75d-3052e78aa031" (UID: "21c615f8-7088-4a30-b75d-3052e78aa031"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:36.452899 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.452850 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/21c615f8-7088-4a30-b75d-3052e78aa031-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:36.452899 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.452884 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mq5d8\" (UniqueName: \"kubernetes.io/projected/21c615f8-7088-4a30-b75d-3052e78aa031-kube-api-access-mq5d8\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:36.452899 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.452899 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21c615f8-7088-4a30-b75d-3052e78aa031-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:36.453079 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.452914 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c615f8-7088-4a30-b75d-3052e78aa031-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:00:36.852650 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.852604 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerStarted","Data":"679c1b93d0429dd845aee6a60277b288ea9157e26b05321b31886c68e6509a89"} Apr 24 22:00:36.852650 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.852652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" 
event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerStarted","Data":"63ca53a08b05345303119c721c5afb079c9d1e62df973df091899890070869a4"} Apr 24 22:00:36.854097 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.854072 2578 generic.go:358] "Generic (PLEG): container finished" podID="21c615f8-7088-4a30-b75d-3052e78aa031" containerID="407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83" exitCode=0 Apr 24 22:00:36.854213 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.854118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerDied","Data":"407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83"} Apr 24 22:00:36.854213 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.854136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" event={"ID":"21c615f8-7088-4a30-b75d-3052e78aa031","Type":"ContainerDied","Data":"1e62003ddbe13beb0ba4fba399fae60fc18c0bf54c80ff11bc3bb4f80792a3ba"} Apr 24 22:00:36.854213 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.854152 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq" Apr 24 22:00:36.854213 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.854157 2578 scope.go:117] "RemoveContainer" containerID="aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707" Apr 24 22:00:36.861833 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.861818 2578 scope.go:117] "RemoveContainer" containerID="407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83" Apr 24 22:00:36.868911 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.868897 2578 scope.go:117] "RemoveContainer" containerID="a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5" Apr 24 22:00:36.876345 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.876328 2578 scope.go:117] "RemoveContainer" containerID="aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707" Apr 24 22:00:36.876587 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:00:36.876567 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707\": container with ID starting with aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707 not found: ID does not exist" containerID="aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707" Apr 24 22:00:36.876633 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.876599 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707"} err="failed to get container status \"aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707\": rpc error: code = NotFound desc = could not find container \"aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707\": container with ID starting with aa8765d52d0c1688e72314a5227a51e2f48ab7d02b577c946d3e1187077de707 not found: ID does not exist" Apr 24 
22:00:36.876633 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.876618 2578 scope.go:117] "RemoveContainer" containerID="407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83" Apr 24 22:00:36.876872 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:00:36.876853 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83\": container with ID starting with 407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83 not found: ID does not exist" containerID="407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83" Apr 24 22:00:36.876919 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.876879 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83"} err="failed to get container status \"407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83\": rpc error: code = NotFound desc = could not find container \"407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83\": container with ID starting with 407ec2614310e952c9d2df5f8bf1815951f02fffdd1abaa096a97d284e8eeb83 not found: ID does not exist" Apr 24 22:00:36.876919 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.876899 2578 scope.go:117] "RemoveContainer" containerID="a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5" Apr 24 22:00:36.877092 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:00:36.877077 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5\": container with ID starting with a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5 not found: ID does not exist" containerID="a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5" Apr 24 22:00:36.877133 
ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.877097 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5"} err="failed to get container status \"a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5\": rpc error: code = NotFound desc = could not find container \"a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5\": container with ID starting with a94c2d3d56d8086ac03ecc821512d115fd693bfb4736065431d88c4976d3faf5 not found: ID does not exist"
Apr 24 22:00:36.890307 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.890281 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq"]
Apr 24 22:00:36.895345 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:36.895324 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5747956474-9w8pq"]
Apr 24 22:00:38.659590 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:38.659549 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" path="/var/lib/kubelet/pods/21c615f8-7088-4a30-b75d-3052e78aa031/volumes"
Apr 24 22:00:40.868965 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:40.868935 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d606011-6522-4a74-a075-07fd69dccd79" containerID="679c1b93d0429dd845aee6a60277b288ea9157e26b05321b31886c68e6509a89" exitCode=0
Apr 24 22:00:40.869291 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:40.868985 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerDied","Data":"679c1b93d0429dd845aee6a60277b288ea9157e26b05321b31886c68e6509a89"}
Apr 24 22:00:41.874046 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:41.874013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerStarted","Data":"1f3e7fbbbdea0953c02f2722ecd1bf335f70bdf83603da2ce8c7de03493d8f6f"}
Apr 24 22:00:41.874046 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:41.874051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerStarted","Data":"8835817856b280c7c14f64f0605420125d77a4c2340d0dc3fe7b492f8f689729"}
Apr 24 22:00:41.874428 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:41.874241 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"
Apr 24 22:00:41.896579 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:41.896539 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" podStartSLOduration=6.896526239 podStartE2EDuration="6.896526239s" podCreationTimestamp="2026-04-24 22:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:00:41.894634456 +0000 UTC m=+2665.862436529" watchObservedRunningTime="2026-04-24 22:00:41.896526239 +0000 UTC m=+2665.864328297"
Apr 24 22:00:42.877372 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:42.877343 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"
Apr 24 22:00:48.887519 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:00:48.887491 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"
Apr 24 22:01:16.728594 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:16.728568 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 22:01:16.734345 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:16.734324 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 22:01:18.891320 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:18.891295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"
Apr 24 22:01:25.625944 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.625910 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"]
Apr 24 22:01:25.626400 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.626256 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kserve-container" containerID="cri-o://8835817856b280c7c14f64f0605420125d77a4c2340d0dc3fe7b492f8f689729" gracePeriod=30
Apr 24 22:01:25.626400 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.626278 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kube-rbac-proxy" containerID="cri-o://1f3e7fbbbdea0953c02f2722ecd1bf335f70bdf83603da2ce8c7de03493d8f6f" gracePeriod=30
Apr 24 22:01:25.695176 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695145 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"]
Apr 24 22:01:25.695470 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695458 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kube-rbac-proxy"
Apr 24 22:01:25.695520 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695472 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kube-rbac-proxy"
Apr 24 22:01:25.695520 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695496 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="storage-initializer"
Apr 24 22:01:25.695520 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695503 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="storage-initializer"
Apr 24 22:01:25.695520 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695509 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container"
Apr 24 22:01:25.695520 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695514 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container"
Apr 24 22:01:25.695689 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695567 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kserve-container"
Apr 24 22:01:25.695689 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.695575 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="21c615f8-7088-4a30-b75d-3052e78aa031" containerName="kube-rbac-proxy"
Apr 24 22:01:25.703310 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.703286 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.706848 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.706826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\""
Apr 24 22:01:25.706946 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.706885 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 24 22:01:25.713595 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.713567 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"]
Apr 24 22:01:25.803993 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.803963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtkq\" (UniqueName: \"kubernetes.io/projected/d93fd8d0-0cea-4c2e-a676-6801b108163a-kube-api-access-gbtkq\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.804108 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.804028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.804108 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.804092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d93fd8d0-0cea-4c2e-a676-6801b108163a-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.804207 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.804128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d93fd8d0-0cea-4c2e-a676-6801b108163a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.904530 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.904506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.904621 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.904548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d93fd8d0-0cea-4c2e-a676-6801b108163a-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.904711 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:01:25.904632 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 24 22:01:25.904711 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.904658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d93fd8d0-0cea-4c2e-a676-6801b108163a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.904711 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:01:25.904695 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls podName:d93fd8d0-0cea-4c2e-a676-6801b108163a nodeName:}" failed. No retries permitted until 2026-04-24 22:01:26.404679615 +0000 UTC m=+2710.372481658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls") pod "isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" (UID: "d93fd8d0-0cea-4c2e-a676-6801b108163a") : secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 24 22:01:25.904892 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.904733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtkq\" (UniqueName: \"kubernetes.io/projected/d93fd8d0-0cea-4c2e-a676-6801b108163a-kube-api-access-gbtkq\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.904892 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.904862 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d93fd8d0-0cea-4c2e-a676-6801b108163a-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.905230 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.905212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d93fd8d0-0cea-4c2e-a676-6801b108163a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:25.915434 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:25.915413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtkq\" (UniqueName: \"kubernetes.io/projected/d93fd8d0-0cea-4c2e-a676-6801b108163a-kube-api-access-gbtkq\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:26.009515 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:26.009494 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d606011-6522-4a74-a075-07fd69dccd79" containerID="1f3e7fbbbdea0953c02f2722ecd1bf335f70bdf83603da2ce8c7de03493d8f6f" exitCode=2
Apr 24 22:01:26.009604 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:26.009550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerDied","Data":"1f3e7fbbbdea0953c02f2722ecd1bf335f70bdf83603da2ce8c7de03493d8f6f"}
Apr 24 22:01:26.407763 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:26.407723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:26.410236 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:26.410215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-f9cd8f646-pq5t5\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:26.614499 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:26.614469 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:26.738167 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:26.738143 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"]
Apr 24 22:01:26.740349 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:01:26.740320 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93fd8d0_0cea_4c2e_a676_6801b108163a.slice/crio-a6dbb96061822b81b12023e3d5b54c950f589d10ae6cd2553c529ef20bf0bbce WatchSource:0}: Error finding container a6dbb96061822b81b12023e3d5b54c950f589d10ae6cd2553c529ef20bf0bbce: Status 404 returned error can't find the container with id a6dbb96061822b81b12023e3d5b54c950f589d10ae6cd2553c529ef20bf0bbce
Apr 24 22:01:27.014722 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:27.014636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerStarted","Data":"8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32"}
Apr 24 22:01:27.014722 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:27.014679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerStarted","Data":"a6dbb96061822b81b12023e3d5b54c950f589d10ae6cd2553c529ef20bf0bbce"}
Apr 24 22:01:28.883611 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:28.883574 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 24 22:01:28.889008 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:28.888981 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.53:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:01:31.030948 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:31.030917 2578 generic.go:358] "Generic (PLEG): container finished" podID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerID="8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32" exitCode=0
Apr 24 22:01:31.031303 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:31.030996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerDied","Data":"8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32"}
Apr 24 22:01:32.035804 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.035774 2578 generic.go:358] "Generic (PLEG): container finished" podID="1d606011-6522-4a74-a075-07fd69dccd79" containerID="8835817856b280c7c14f64f0605420125d77a4c2340d0dc3fe7b492f8f689729" exitCode=0
Apr 24 22:01:32.036143 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.035781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerDied","Data":"8835817856b280c7c14f64f0605420125d77a4c2340d0dc3fe7b492f8f689729"}
Apr 24 22:01:32.036143 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.035886 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p" event={"ID":"1d606011-6522-4a74-a075-07fd69dccd79","Type":"ContainerDied","Data":"63ca53a08b05345303119c721c5afb079c9d1e62df973df091899890070869a4"}
Apr 24 22:01:32.036143 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.035904 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ca53a08b05345303119c721c5afb079c9d1e62df973df091899890070869a4"
Apr 24 22:01:32.037847 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.037821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerStarted","Data":"163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a"}
Apr 24 22:01:32.037847 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.037848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerStarted","Data":"630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b"}
Apr 24 22:01:32.038037 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.038020 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:32.039689 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.039673 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"
Apr 24 22:01:32.075345 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.074942 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podStartSLOduration=7.074925705 podStartE2EDuration="7.074925705s" podCreationTimestamp="2026-04-24 22:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:01:32.072502961 +0000 UTC m=+2716.040305037" watchObservedRunningTime="2026-04-24 22:01:32.074925705 +0000 UTC m=+2716.042727768"
Apr 24 22:01:32.153013 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.152989 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rcr\" (UniqueName: \"kubernetes.io/projected/1d606011-6522-4a74-a075-07fd69dccd79-kube-api-access-69rcr\") pod \"1d606011-6522-4a74-a075-07fd69dccd79\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") "
Apr 24 22:01:32.153139 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.153024 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d606011-6522-4a74-a075-07fd69dccd79-kserve-provision-location\") pod \"1d606011-6522-4a74-a075-07fd69dccd79\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") "
Apr 24 22:01:32.153139 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.153078 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d606011-6522-4a74-a075-07fd69dccd79-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"1d606011-6522-4a74-a075-07fd69dccd79\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") "
Apr 24 22:01:32.153139 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.153113 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606011-6522-4a74-a075-07fd69dccd79-proxy-tls\") pod \"1d606011-6522-4a74-a075-07fd69dccd79\" (UID: \"1d606011-6522-4a74-a075-07fd69dccd79\") "
Apr 24 22:01:32.153385 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.153353 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d606011-6522-4a74-a075-07fd69dccd79-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1d606011-6522-4a74-a075-07fd69dccd79" (UID: "1d606011-6522-4a74-a075-07fd69dccd79"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:01:32.153454 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.153391 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d606011-6522-4a74-a075-07fd69dccd79-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "1d606011-6522-4a74-a075-07fd69dccd79" (UID: "1d606011-6522-4a74-a075-07fd69dccd79"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:01:32.153454 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.153423 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d606011-6522-4a74-a075-07fd69dccd79-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:01:32.155221 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.155195 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d606011-6522-4a74-a075-07fd69dccd79-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1d606011-6522-4a74-a075-07fd69dccd79" (UID: "1d606011-6522-4a74-a075-07fd69dccd79"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:01:32.155318 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.155246 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d606011-6522-4a74-a075-07fd69dccd79-kube-api-access-69rcr" (OuterVolumeSpecName: "kube-api-access-69rcr") pod "1d606011-6522-4a74-a075-07fd69dccd79" (UID: "1d606011-6522-4a74-a075-07fd69dccd79"). InnerVolumeSpecName "kube-api-access-69rcr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:01:32.253997 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.253973 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69rcr\" (UniqueName: \"kubernetes.io/projected/1d606011-6522-4a74-a075-07fd69dccd79-kube-api-access-69rcr\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:01:32.253997 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.253996 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d606011-6522-4a74-a075-07fd69dccd79-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:01:32.254118 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:32.254007 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606011-6522-4a74-a075-07fd69dccd79-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:01:33.040607 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:33.040540 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"
Apr 24 22:01:33.041028 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:33.041012 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:33.041894 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:33.041863 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:01:33.059198 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:33.059171 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"]
Apr 24 22:01:33.065323 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:33.065303 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8rd8p"]
Apr 24 22:01:34.043673 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:34.043633 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:01:34.659540 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:34.659507 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d606011-6522-4a74-a075-07fd69dccd79" path="/var/lib/kubelet/pods/1d606011-6522-4a74-a075-07fd69dccd79/volumes"
Apr 24 22:01:39.048854 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:39.048826 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:01:39.049466 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:39.049436 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:01:49.050149 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:49.050109 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:01:59.049829 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:01:59.049785 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:02:09.049466 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:09.049429 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:02:19.049833 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:19.049793 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:02:29.050437 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:29.050370 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"
Apr 24 22:02:35.853967 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.853929 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"]
Apr 24 22:02:35.854388 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.854341 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" containerID="cri-o://630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b" gracePeriod=30
Apr 24 22:02:35.854502 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.854382 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kube-rbac-proxy" containerID="cri-o://163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a" gracePeriod=30
Apr 24 22:02:35.934925 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.934893 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"]
Apr 24 22:02:35.935232 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935220 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kube-rbac-proxy"
Apr 24 22:02:35.935273 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935233 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kube-rbac-proxy"
Apr 24 22:02:35.935273 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935242 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kserve-container"
Apr 24 22:02:35.935273 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935248 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kserve-container"
Apr 24 22:02:35.935273 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935264 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="storage-initializer"
Apr 24 22:02:35.935273 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935269 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="storage-initializer"
Apr 24 22:02:35.935432 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935327 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kserve-container"
Apr 24 22:02:35.935432 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.935336 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d606011-6522-4a74-a075-07fd69dccd79" containerName="kube-rbac-proxy"
Apr 24 22:02:35.938794 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.938773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:35.941086 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.941062 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\""
Apr 24 22:02:35.941180 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.941155 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\""
Apr 24 22:02:35.947982 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:35.947963 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"]
Apr 24 22:02:36.005713 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.005685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e43b1f30-423a-44f6-93e5-b1943b4f5355-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.005836 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.005740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e43b1f30-423a-44f6-93e5-b1943b4f5355-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.005836 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.005821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rvxk\" (UniqueName: \"kubernetes.io/projected/e43b1f30-423a-44f6-93e5-b1943b4f5355-kube-api-access-2rvxk\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.005917 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.005864 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e43b1f30-423a-44f6-93e5-b1943b4f5355-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.106946 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.106888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e43b1f30-423a-44f6-93e5-b1943b4f5355-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.106946 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.106921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rvxk\" (UniqueName: \"kubernetes.io/projected/e43b1f30-423a-44f6-93e5-b1943b4f5355-kube-api-access-2rvxk\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.107089 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.106948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e43b1f30-423a-44f6-93e5-b1943b4f5355-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.107089 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.106981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e43b1f30-423a-44f6-93e5-b1943b4f5355-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.107332 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.107316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e43b1f30-423a-44f6-93e5-b1943b4f5355-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.107493 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.107475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e43b1f30-423a-44f6-93e5-b1943b4f5355-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"
Apr 24 22:02:36.109390 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.109366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e43b1f30-423a-44f6-93e5-b1943b4f5355-proxy-tls\") pod
\"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:02:36.115217 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.115193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rvxk\" (UniqueName: \"kubernetes.io/projected/e43b1f30-423a-44f6-93e5-b1943b4f5355-kube-api-access-2rvxk\") pod \"isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:02:36.223310 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.223282 2578 generic.go:358] "Generic (PLEG): container finished" podID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerID="163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a" exitCode=2 Apr 24 22:02:36.223437 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.223366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerDied","Data":"163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a"} Apr 24 22:02:36.249900 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.249876 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:02:36.370484 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:36.370464 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"] Apr 24 22:02:36.372638 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:02:36.372611 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode43b1f30_423a_44f6_93e5_b1943b4f5355.slice/crio-51a243e0e343ec895e3d865deaf458e6915628b5ce715a09e10ae2c0947da621 WatchSource:0}: Error finding container 51a243e0e343ec895e3d865deaf458e6915628b5ce715a09e10ae2c0947da621: Status 404 returned error can't find the container with id 51a243e0e343ec895e3d865deaf458e6915628b5ce715a09e10ae2c0947da621 Apr 24 22:02:37.228216 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:37.228179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerStarted","Data":"8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f"} Apr 24 22:02:37.228216 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:37.228217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerStarted","Data":"51a243e0e343ec895e3d865deaf458e6915628b5ce715a09e10ae2c0947da621"} Apr 24 22:02:39.044481 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.044441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection 
refused" Apr 24 22:02:39.049968 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.049944 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 24 22:02:39.601468 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.601442 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" Apr 24 22:02:39.736039 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.735976 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbtkq\" (UniqueName: \"kubernetes.io/projected/d93fd8d0-0cea-4c2e-a676-6801b108163a-kube-api-access-gbtkq\") pod \"d93fd8d0-0cea-4c2e-a676-6801b108163a\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " Apr 24 22:02:39.736039 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.736026 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d93fd8d0-0cea-4c2e-a676-6801b108163a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"d93fd8d0-0cea-4c2e-a676-6801b108163a\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " Apr 24 22:02:39.736200 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.736046 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls\") pod \"d93fd8d0-0cea-4c2e-a676-6801b108163a\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " Apr 24 22:02:39.736200 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.736067 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d93fd8d0-0cea-4c2e-a676-6801b108163a-kserve-provision-location\") pod \"d93fd8d0-0cea-4c2e-a676-6801b108163a\" (UID: \"d93fd8d0-0cea-4c2e-a676-6801b108163a\") " Apr 24 22:02:39.736410 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.736370 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d93fd8d0-0cea-4c2e-a676-6801b108163a-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "d93fd8d0-0cea-4c2e-a676-6801b108163a" (UID: "d93fd8d0-0cea-4c2e-a676-6801b108163a"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:02:39.736410 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.736385 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93fd8d0-0cea-4c2e-a676-6801b108163a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d93fd8d0-0cea-4c2e-a676-6801b108163a" (UID: "d93fd8d0-0cea-4c2e-a676-6801b108163a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:02:39.738184 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.738163 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93fd8d0-0cea-4c2e-a676-6801b108163a-kube-api-access-gbtkq" (OuterVolumeSpecName: "kube-api-access-gbtkq") pod "d93fd8d0-0cea-4c2e-a676-6801b108163a" (UID: "d93fd8d0-0cea-4c2e-a676-6801b108163a"). InnerVolumeSpecName "kube-api-access-gbtkq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:02:39.738257 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.738204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d93fd8d0-0cea-4c2e-a676-6801b108163a" (UID: "d93fd8d0-0cea-4c2e-a676-6801b108163a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:02:39.837375 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.837350 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d93fd8d0-0cea-4c2e-a676-6801b108163a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:02:39.837375 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.837372 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d93fd8d0-0cea-4c2e-a676-6801b108163a-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:02:39.837497 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.837383 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d93fd8d0-0cea-4c2e-a676-6801b108163a-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:02:39.837497 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:39.837398 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbtkq\" (UniqueName: \"kubernetes.io/projected/d93fd8d0-0cea-4c2e-a676-6801b108163a-kube-api-access-gbtkq\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:02:40.239415 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.239387 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerID="8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f" exitCode=0 Apr 24 22:02:40.239802 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.239461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerDied","Data":"8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f"} Apr 24 22:02:40.241208 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.241186 2578 generic.go:358] "Generic (PLEG): container finished" podID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerID="630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b" exitCode=0 Apr 24 22:02:40.241323 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.241232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerDied","Data":"630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b"} Apr 24 22:02:40.241323 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.241249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" event={"ID":"d93fd8d0-0cea-4c2e-a676-6801b108163a","Type":"ContainerDied","Data":"a6dbb96061822b81b12023e3d5b54c950f589d10ae6cd2553c529ef20bf0bbce"} Apr 24 22:02:40.241323 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.241264 2578 scope.go:117] "RemoveContainer" containerID="163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a" Apr 24 22:02:40.241323 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.241269 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5" Apr 24 22:02:40.249340 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.249322 2578 scope.go:117] "RemoveContainer" containerID="630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b" Apr 24 22:02:40.256312 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.256268 2578 scope.go:117] "RemoveContainer" containerID="8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32" Apr 24 22:02:40.264944 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.264923 2578 scope.go:117] "RemoveContainer" containerID="163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a" Apr 24 22:02:40.265234 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:02:40.265207 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a\": container with ID starting with 163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a not found: ID does not exist" containerID="163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a" Apr 24 22:02:40.265322 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.265246 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a"} err="failed to get container status \"163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a\": rpc error: code = NotFound desc = could not find container \"163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a\": container with ID starting with 163f3852fbad017046c8026ddb6e69d240b2d7d930299ec5516a352cc4fdab9a not found: ID does not exist" Apr 24 22:02:40.265322 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.265273 2578 scope.go:117] "RemoveContainer" containerID="630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b" Apr 24 
22:02:40.265558 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:02:40.265538 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b\": container with ID starting with 630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b not found: ID does not exist" containerID="630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b" Apr 24 22:02:40.265624 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.265568 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b"} err="failed to get container status \"630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b\": rpc error: code = NotFound desc = could not find container \"630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b\": container with ID starting with 630a373dcc2bebd6227cacfb93cbff37951f1651651cadd45b0b3827c56fcc8b not found: ID does not exist" Apr 24 22:02:40.265624 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.265590 2578 scope.go:117] "RemoveContainer" containerID="8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32" Apr 24 22:02:40.265851 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:02:40.265834 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32\": container with ID starting with 8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32 not found: ID does not exist" containerID="8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32" Apr 24 22:02:40.265924 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.265856 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32"} err="failed to get container status \"8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32\": rpc error: code = NotFound desc = could not find container \"8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32\": container with ID starting with 8d29f95432f3eec0d1b0ab876a0741ffbc999556c2c3d566c4d3225b63b63e32 not found: ID does not exist" Apr 24 22:02:40.269898 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.269876 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"] Apr 24 22:02:40.273499 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.273479 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-f9cd8f646-pq5t5"] Apr 24 22:02:40.660015 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:40.659984 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" path="/var/lib/kubelet/pods/d93fd8d0-0cea-4c2e-a676-6801b108163a/volumes" Apr 24 22:02:41.246324 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:41.246284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerStarted","Data":"ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af"} Apr 24 22:02:41.246324 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:41.246328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerStarted","Data":"3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0"} Apr 24 22:02:41.246866 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:41.246554 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:02:41.266474 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:41.266396 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podStartSLOduration=6.2663837749999995 podStartE2EDuration="6.266383775s" podCreationTimestamp="2026-04-24 22:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:02:41.265076784 +0000 UTC m=+2785.232878845" watchObservedRunningTime="2026-04-24 22:02:41.266383775 +0000 UTC m=+2785.234185836" Apr 24 22:02:42.250681 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:42.250652 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:02:42.251878 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:42.251853 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:02:43.253911 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:43.253873 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:02:48.259042 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:48.259015 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:02:48.259607 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:02:48.259579 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:02:58.259566 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:02:58.259528 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:03:08.259684 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:08.259647 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:03:18.260201 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:18.260163 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:03:28.260277 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:28.260239 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:03:38.259684 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:38.259642 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:03:48.260892 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:48.260862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:03:56.018423 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:56.018340 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"] Apr 24 22:03:56.018881 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:56.018706 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" containerID="cri-o://3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0" gracePeriod=30 Apr 24 22:03:56.018881 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:56.018728 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kube-rbac-proxy" containerID="cri-o://ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af" gracePeriod=30 Apr 24 22:03:56.476221 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:56.476188 2578 generic.go:358] "Generic (PLEG): container finished" podID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerID="ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af" exitCode=2 Apr 24 22:03:56.476384 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:56.476245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" 
event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerDied","Data":"ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af"} Apr 24 22:03:58.254570 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:58.254528 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.55:8643/healthz\": dial tcp 10.134.0.55:8643: connect: connection refused" Apr 24 22:03:58.259525 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:58.259496 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 22:03:59.655688 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.655669 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:03:59.733608 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.733545 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e43b1f30-423a-44f6-93e5-b1943b4f5355-kserve-provision-location\") pod \"e43b1f30-423a-44f6-93e5-b1943b4f5355\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " Apr 24 22:03:59.733608 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.733573 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e43b1f30-423a-44f6-93e5-b1943b4f5355-proxy-tls\") pod \"e43b1f30-423a-44f6-93e5-b1943b4f5355\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " Apr 24 22:03:59.733785 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.733620 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e43b1f30-423a-44f6-93e5-b1943b4f5355-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"e43b1f30-423a-44f6-93e5-b1943b4f5355\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " Apr 24 22:03:59.733785 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.733678 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rvxk\" (UniqueName: \"kubernetes.io/projected/e43b1f30-423a-44f6-93e5-b1943b4f5355-kube-api-access-2rvxk\") pod \"e43b1f30-423a-44f6-93e5-b1943b4f5355\" (UID: \"e43b1f30-423a-44f6-93e5-b1943b4f5355\") " Apr 24 22:03:59.733979 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.733951 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43b1f30-423a-44f6-93e5-b1943b4f5355-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"e43b1f30-423a-44f6-93e5-b1943b4f5355" (UID: "e43b1f30-423a-44f6-93e5-b1943b4f5355"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:03:59.734068 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.734044 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e43b1f30-423a-44f6-93e5-b1943b4f5355-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "e43b1f30-423a-44f6-93e5-b1943b4f5355" (UID: "e43b1f30-423a-44f6-93e5-b1943b4f5355"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:03:59.735785 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.735764 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43b1f30-423a-44f6-93e5-b1943b4f5355-kube-api-access-2rvxk" (OuterVolumeSpecName: "kube-api-access-2rvxk") pod "e43b1f30-423a-44f6-93e5-b1943b4f5355" (UID: "e43b1f30-423a-44f6-93e5-b1943b4f5355"). InnerVolumeSpecName "kube-api-access-2rvxk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:03:59.735866 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.735846 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43b1f30-423a-44f6-93e5-b1943b4f5355-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e43b1f30-423a-44f6-93e5-b1943b4f5355" (UID: "e43b1f30-423a-44f6-93e5-b1943b4f5355"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:03:59.834907 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.834877 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e43b1f30-423a-44f6-93e5-b1943b4f5355-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:03:59.834907 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.834904 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rvxk\" (UniqueName: \"kubernetes.io/projected/e43b1f30-423a-44f6-93e5-b1943b4f5355-kube-api-access-2rvxk\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:03:59.835037 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.834921 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e43b1f30-423a-44f6-93e5-b1943b4f5355-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:03:59.835037 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:03:59.834933 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e43b1f30-423a-44f6-93e5-b1943b4f5355-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:04:00.488968 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.488934 2578 generic.go:358] "Generic (PLEG): container finished" podID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerID="3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0" exitCode=0 Apr 24 22:04:00.489127 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.489028 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" Apr 24 22:04:00.489127 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.489024 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerDied","Data":"3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0"} Apr 24 22:04:00.489203 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.489139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv" event={"ID":"e43b1f30-423a-44f6-93e5-b1943b4f5355","Type":"ContainerDied","Data":"51a243e0e343ec895e3d865deaf458e6915628b5ce715a09e10ae2c0947da621"} Apr 24 22:04:00.489203 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.489156 2578 scope.go:117] "RemoveContainer" containerID="ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af" Apr 24 22:04:00.498379 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.498362 2578 scope.go:117] "RemoveContainer" containerID="3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0" Apr 24 22:04:00.505560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.505546 2578 scope.go:117] "RemoveContainer" containerID="8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f" Apr 24 22:04:00.510523 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.510499 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"] Apr 24 22:04:00.513305 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.513287 2578 scope.go:117] "RemoveContainer" containerID="ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af" Apr 24 22:04:00.513567 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:04:00.513549 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af\": container with ID starting with ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af not found: ID does not exist" containerID="ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af" Apr 24 22:04:00.513645 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.513574 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af"} err="failed to get container status \"ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af\": rpc error: code = NotFound desc = could not find container \"ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af\": container with ID starting with ac9e6d64d207a0b6ff9cc3bf4b6f3d0983e45b4408b4abfc630b10a24c1281af not found: ID does not exist" Apr 24 22:04:00.513645 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.513589 2578 scope.go:117] "RemoveContainer" containerID="3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0" Apr 24 22:04:00.513866 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:04:00.513852 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0\": container with ID starting with 3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0 not found: ID does not exist" containerID="3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0" Apr 24 22:04:00.513917 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.513871 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0"} err="failed to get container status \"3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0\": rpc error: code = NotFound desc = could 
not find container \"3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0\": container with ID starting with 3ee0641aa2526437ee93f05b0d282e06b5265c64d14c2797affb5923b366cfa0 not found: ID does not exist" Apr 24 22:04:00.513917 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.513884 2578 scope.go:117] "RemoveContainer" containerID="8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f" Apr 24 22:04:00.514123 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:04:00.514106 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f\": container with ID starting with 8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f not found: ID does not exist" containerID="8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f" Apr 24 22:04:00.514171 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.514128 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f"} err="failed to get container status \"8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f\": rpc error: code = NotFound desc = could not find container \"8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f\": container with ID starting with 8ca0683464bfe6c8442fdec349470751967dc5f15ded04b7646b5a649e410d0f not found: ID does not exist" Apr 24 22:04:00.514367 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.514349 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-566cfc9859-kjnsv"] Apr 24 22:04:00.659678 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:04:00.659659 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" path="/var/lib/kubelet/pods/e43b1f30-423a-44f6-93e5-b1943b4f5355/volumes" Apr 24 
22:06:16.750940 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:06:16.750911 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 22:06:16.757045 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:06:16.757022 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 22:07:16.870306 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:07:16.870239 2578 scope.go:117] "RemoveContainer" containerID="8835817856b280c7c14f64f0605420125d77a4c2340d0dc3fe7b492f8f689729" Apr 24 22:07:16.877825 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:07:16.877803 2578 scope.go:117] "RemoveContainer" containerID="1f3e7fbbbdea0953c02f2722ecd1bf335f70bdf83603da2ce8c7de03493d8f6f" Apr 24 22:07:16.884295 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:07:16.884277 2578 scope.go:117] "RemoveContainer" containerID="679c1b93d0429dd845aee6a60277b288ea9157e26b05321b31886c68e6509a89" Apr 24 22:09:10.153510 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153475 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"] Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153802 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="storage-initializer" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153813 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="storage-initializer" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153820 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="storage-initializer" 
Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153826 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="storage-initializer" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153840 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153846 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153859 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kube-rbac-proxy" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153865 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kube-rbac-proxy" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153871 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kube-rbac-proxy" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153876 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kube-rbac-proxy" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153883 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153887 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" 
containerName="kserve-container" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153937 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kserve-container" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153945 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d93fd8d0-0cea-4c2e-a676-6801b108163a" containerName="kube-rbac-proxy" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153952 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kube-rbac-proxy" Apr 24 22:09:10.156007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.153960 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e43b1f30-423a-44f6-93e5-b1943b4f5355" containerName="kserve-container" Apr 24 22:09:10.156969 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.156952 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.159308 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.159282 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 22:09:10.159436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.159311 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 22:09:10.159436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.159324 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:09:10.159436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.159313 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:09:10.159970 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.159952 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:09:10.165623 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.165604 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"] Apr 24 22:09:10.333214 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.333186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f18907a-ee4d-470e-bddd-c8ec8099dda0-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 
22:09:10.333349 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.333228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f18907a-ee4d-470e-bddd-c8ec8099dda0-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.333349 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.333265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.333349 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.333305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbjgb\" (UniqueName: \"kubernetes.io/projected/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kube-api-access-qbjgb\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.434077 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.434013 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f18907a-ee4d-470e-bddd-c8ec8099dda0-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.434077 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.434070 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.434277 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.434091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbjgb\" (UniqueName: \"kubernetes.io/projected/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kube-api-access-qbjgb\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.434277 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.434130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f18907a-ee4d-470e-bddd-c8ec8099dda0-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.434492 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.434474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.434716 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.434694 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f18907a-ee4d-470e-bddd-c8ec8099dda0-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.436515 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.436495 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f18907a-ee4d-470e-bddd-c8ec8099dda0-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.442387 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.442363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbjgb\" (UniqueName: \"kubernetes.io/projected/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kube-api-access-qbjgb\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.468173 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.468150 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:10.585862 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.585842 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"] Apr 24 22:09:10.588129 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:09:10.588102 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f18907a_ee4d_470e_bddd_c8ec8099dda0.slice/crio-86c780cee0929a8aa8294e9594bf47c1459ba6160a31a6a84b5426bbbc3993d9 WatchSource:0}: Error finding container 86c780cee0929a8aa8294e9594bf47c1459ba6160a31a6a84b5426bbbc3993d9: Status 404 returned error can't find the container with id 86c780cee0929a8aa8294e9594bf47c1459ba6160a31a6a84b5426bbbc3993d9 Apr 24 22:09:10.589924 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:10.589905 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:09:11.353864 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:11.353784 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerStarted","Data":"7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb"} Apr 24 22:09:11.353864 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:11.353825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerStarted","Data":"86c780cee0929a8aa8294e9594bf47c1459ba6160a31a6a84b5426bbbc3993d9"} Apr 24 22:09:15.366219 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:15.366184 2578 generic.go:358] "Generic (PLEG): container finished" podID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" 
containerID="7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb" exitCode=0 Apr 24 22:09:15.366581 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:15.366224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerDied","Data":"7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb"} Apr 24 22:09:16.370913 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:16.370875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerStarted","Data":"57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3"} Apr 24 22:09:16.370913 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:16.370909 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerStarted","Data":"2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa"} Apr 24 22:09:16.371404 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:16.371237 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:16.371404 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:16.371266 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:16.393487 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:16.393435 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" podStartSLOduration=6.393423173 podStartE2EDuration="6.393423173s" podCreationTimestamp="2026-04-24 22:09:10 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:09:16.391606253 +0000 UTC m=+3180.359408315" watchObservedRunningTime="2026-04-24 22:09:16.393423173 +0000 UTC m=+3180.361225235" Apr 24 22:09:22.379373 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:22.379346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:09:52.383523 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:09:52.383493 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" Apr 24 22:10:00.237059 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.236973 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"] Apr 24 22:10:00.237580 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.237375 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kserve-container" containerID="cri-o://2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa" gracePeriod=30 Apr 24 22:10:00.237580 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.237402 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kube-rbac-proxy" containerID="cri-o://57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3" gracePeriod=30 Apr 24 22:10:00.350825 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.350796 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"] 
Apr 24 22:10:00.354220 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.354200 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.356958 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.356940 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 22:10:00.360962 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.360779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:10:00.368255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.368234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"] Apr 24 22:10:00.395531 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.395510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c239bcbf-6b4f-492a-995e-87ca125f4596-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.395654 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.395541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c239bcbf-6b4f-492a-995e-87ca125f4596-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.395654 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.395572 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c239bcbf-6b4f-492a-995e-87ca125f4596-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.395786 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.395697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx7xr\" (UniqueName: \"kubernetes.io/projected/c239bcbf-6b4f-492a-995e-87ca125f4596-kube-api-access-jx7xr\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.496356 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.496295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx7xr\" (UniqueName: \"kubernetes.io/projected/c239bcbf-6b4f-492a-995e-87ca125f4596-kube-api-access-jx7xr\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.496356 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.496348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c239bcbf-6b4f-492a-995e-87ca125f4596-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.496524 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.496369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c239bcbf-6b4f-492a-995e-87ca125f4596-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.496524 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.496403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c239bcbf-6b4f-492a-995e-87ca125f4596-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.496844 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.496821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c239bcbf-6b4f-492a-995e-87ca125f4596-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.497103 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.497084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c239bcbf-6b4f-492a-995e-87ca125f4596-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.499130 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.499110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c239bcbf-6b4f-492a-995e-87ca125f4596-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.502352 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.502329 2578 generic.go:358] "Generic (PLEG): container finished" podID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerID="57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3" exitCode=2 Apr 24 22:10:00.502462 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.502378 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerDied","Data":"57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3"} Apr 24 22:10:00.504439 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.504418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx7xr\" (UniqueName: \"kubernetes.io/projected/c239bcbf-6b4f-492a-995e-87ca125f4596-kube-api-access-jx7xr\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-bp5rd\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" Apr 24 22:10:00.664507 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.664480 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:00.788519 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:00.788493 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"]
Apr 24 22:10:00.790543 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:10:00.790521 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc239bcbf_6b4f_492a_995e_87ca125f4596.slice/crio-c350759d6fd4ed6760189f3ded9e43e8617621493d4a8b71a2666e80856a154b WatchSource:0}: Error finding container c350759d6fd4ed6760189f3ded9e43e8617621493d4a8b71a2666e80856a154b: Status 404 returned error can't find the container with id c350759d6fd4ed6760189f3ded9e43e8617621493d4a8b71a2666e80856a154b
Apr 24 22:10:01.506400 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:01.506366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerStarted","Data":"03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8"}
Apr 24 22:10:01.506400 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:01.506402 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerStarted","Data":"c350759d6fd4ed6760189f3ded9e43e8617621493d4a8b71a2666e80856a154b"}
Apr 24 22:10:02.374464 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:02.374408 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.56:8643/healthz\": dial tcp 10.134.0.56:8643: connect: connection refused"
Apr 24 22:10:05.518540 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.518503 2578 generic.go:358] "Generic (PLEG): container finished" podID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerID="03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8" exitCode=0
Apr 24 22:10:05.518917 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.518547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerDied","Data":"03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8"}
Apr 24 22:10:05.878212 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.878186 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"
Apr 24 22:10:05.943834 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.943807 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f18907a-ee4d-470e-bddd-c8ec8099dda0-proxy-tls\") pod \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") "
Apr 24 22:10:05.943940 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.943886 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f18907a-ee4d-470e-bddd-c8ec8099dda0-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") "
Apr 24 22:10:05.943940 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.943913 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kserve-provision-location\") pod \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") "
Apr 24 22:10:05.943940 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.943935 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbjgb\" (UniqueName: \"kubernetes.io/projected/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kube-api-access-qbjgb\") pod \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\" (UID: \"8f18907a-ee4d-470e-bddd-c8ec8099dda0\") "
Apr 24 22:10:05.944245 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.944219 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f18907a-ee4d-470e-bddd-c8ec8099dda0" (UID: "8f18907a-ee4d-470e-bddd-c8ec8099dda0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:10:05.944354 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.944268 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f18907a-ee4d-470e-bddd-c8ec8099dda0-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "8f18907a-ee4d-470e-bddd-c8ec8099dda0" (UID: "8f18907a-ee4d-470e-bddd-c8ec8099dda0"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:10:05.946137 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.946116 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f18907a-ee4d-470e-bddd-c8ec8099dda0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f18907a-ee4d-470e-bddd-c8ec8099dda0" (UID: "8f18907a-ee4d-470e-bddd-c8ec8099dda0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:10:05.946378 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:05.946358 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kube-api-access-qbjgb" (OuterVolumeSpecName: "kube-api-access-qbjgb") pod "8f18907a-ee4d-470e-bddd-c8ec8099dda0" (UID: "8f18907a-ee4d-470e-bddd-c8ec8099dda0"). InnerVolumeSpecName "kube-api-access-qbjgb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:10:06.044779 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.044726 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f18907a-ee4d-470e-bddd-c8ec8099dda0-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:06.044779 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.044778 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:06.044917 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.044790 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbjgb\" (UniqueName: \"kubernetes.io/projected/8f18907a-ee4d-470e-bddd-c8ec8099dda0-kube-api-access-qbjgb\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:06.044917 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.044800 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f18907a-ee4d-470e-bddd-c8ec8099dda0-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:06.523096 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.523067 2578 generic.go:358] "Generic (PLEG): container finished" podID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerID="2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa" exitCode=0
Apr 24 22:10:06.523511 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.523153 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"
Apr 24 22:10:06.523511 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.523151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerDied","Data":"2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa"}
Apr 24 22:10:06.523511 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.523199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8" event={"ID":"8f18907a-ee4d-470e-bddd-c8ec8099dda0","Type":"ContainerDied","Data":"86c780cee0929a8aa8294e9594bf47c1459ba6160a31a6a84b5426bbbc3993d9"}
Apr 24 22:10:06.523511 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.523219 2578 scope.go:117] "RemoveContainer" containerID="57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3"
Apr 24 22:10:06.525148 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.525126 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerStarted","Data":"68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170"}
Apr 24 22:10:06.525256 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.525159 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerStarted","Data":"91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a"}
Apr 24 22:10:06.525372 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.525358 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:06.535288 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.535222 2578 scope.go:117] "RemoveContainer" containerID="2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa"
Apr 24 22:10:06.542496 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.542473 2578 scope.go:117] "RemoveContainer" containerID="7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb"
Apr 24 22:10:06.549341 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.549323 2578 scope.go:117] "RemoveContainer" containerID="57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3"
Apr 24 22:10:06.549590 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:10:06.549571 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3\": container with ID starting with 57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3 not found: ID does not exist" containerID="57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3"
Apr 24 22:10:06.549660 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.549603 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3"} err="failed to get container status \"57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3\": rpc error: code = NotFound desc = could not find container \"57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3\": container with ID starting with 57b99f16d0bdde94aaa4c65d7c5318e579f7c3c4226bf5a45ff77519247227b3 not found: ID does not exist"
Apr 24 22:10:06.549660 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.549626 2578 scope.go:117] "RemoveContainer" containerID="2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa"
Apr 24 22:10:06.549890 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:10:06.549872 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa\": container with ID starting with 2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa not found: ID does not exist" containerID="2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa"
Apr 24 22:10:06.549948 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.549895 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa"} err="failed to get container status \"2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa\": rpc error: code = NotFound desc = could not find container \"2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa\": container with ID starting with 2d58163ef7fcdf5d0a5d52e3c50c3b50080441b80ed1e873e5d9471de167f2aa not found: ID does not exist"
Apr 24 22:10:06.549948 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.549912 2578 scope.go:117] "RemoveContainer" containerID="7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb"
Apr 24 22:10:06.550115 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:10:06.550098 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb\": container with ID starting with 7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb not found: ID does not exist" containerID="7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb"
Apr 24 22:10:06.550167 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.550124 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb"} err="failed to get container status \"7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb\": rpc error: code = NotFound desc = could not find container \"7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb\": container with ID starting with 7ca59253808ba360e5868051485a250b6b6a098fef8d69faa03e090da7a6f8eb not found: ID does not exist"
Apr 24 22:10:06.558950 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.558910 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" podStartSLOduration=6.558899744 podStartE2EDuration="6.558899744s" podCreationTimestamp="2026-04-24 22:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:10:06.554980394 +0000 UTC m=+3230.522782455" watchObservedRunningTime="2026-04-24 22:10:06.558899744 +0000 UTC m=+3230.526701806"
Apr 24 22:10:06.568512 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.568491 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"]
Apr 24 22:10:06.575176 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.575153 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-9kzj8"]
Apr 24 22:10:06.661146 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:06.661125 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" path="/var/lib/kubelet/pods/8f18907a-ee4d-470e-bddd-c8ec8099dda0/volumes"
Apr 24 22:10:07.529075 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:07.529043 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:13.537330 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:13.537303 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:43.541333 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:43.541307 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:50.473521 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:50.473485 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"]
Apr 24 22:10:50.473924 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:50.473795 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kserve-container" containerID="cri-o://91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a" gracePeriod=30
Apr 24 22:10:50.473924 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:50.473825 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kube-rbac-proxy" containerID="cri-o://68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170" gracePeriod=30
Apr 24 22:10:50.655950 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:50.655912 2578 generic.go:358] "Generic (PLEG): container finished" podID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerID="68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170" exitCode=2
Apr 24 22:10:50.658889 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:50.658861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerDied","Data":"68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170"}
Apr 24 22:10:53.532951 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:53.532907 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.57:8643/healthz\": dial tcp 10.134.0.57:8643: connect: connection refused"
Apr 24 22:10:53.538396 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:53.538369 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 24 22:10:55.520464 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.520444 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:55.609905 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.609843 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c239bcbf-6b4f-492a-995e-87ca125f4596-kserve-provision-location\") pod \"c239bcbf-6b4f-492a-995e-87ca125f4596\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") "
Apr 24 22:10:55.609905 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.609876 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx7xr\" (UniqueName: \"kubernetes.io/projected/c239bcbf-6b4f-492a-995e-87ca125f4596-kube-api-access-jx7xr\") pod \"c239bcbf-6b4f-492a-995e-87ca125f4596\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") "
Apr 24 22:10:55.610064 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.609924 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c239bcbf-6b4f-492a-995e-87ca125f4596-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"c239bcbf-6b4f-492a-995e-87ca125f4596\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") "
Apr 24 22:10:55.610064 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.609968 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c239bcbf-6b4f-492a-995e-87ca125f4596-proxy-tls\") pod \"c239bcbf-6b4f-492a-995e-87ca125f4596\" (UID: \"c239bcbf-6b4f-492a-995e-87ca125f4596\") "
Apr 24 22:10:55.610194 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.610169 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c239bcbf-6b4f-492a-995e-87ca125f4596-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c239bcbf-6b4f-492a-995e-87ca125f4596" (UID: "c239bcbf-6b4f-492a-995e-87ca125f4596"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:10:55.610269 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.610252 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c239bcbf-6b4f-492a-995e-87ca125f4596-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "c239bcbf-6b4f-492a-995e-87ca125f4596" (UID: "c239bcbf-6b4f-492a-995e-87ca125f4596"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:10:55.612130 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.612104 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c239bcbf-6b4f-492a-995e-87ca125f4596-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c239bcbf-6b4f-492a-995e-87ca125f4596" (UID: "c239bcbf-6b4f-492a-995e-87ca125f4596"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:10:55.612130 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.612116 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c239bcbf-6b4f-492a-995e-87ca125f4596-kube-api-access-jx7xr" (OuterVolumeSpecName: "kube-api-access-jx7xr") pod "c239bcbf-6b4f-492a-995e-87ca125f4596" (UID: "c239bcbf-6b4f-492a-995e-87ca125f4596"). InnerVolumeSpecName "kube-api-access-jx7xr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:10:55.670523 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.670498 2578 generic.go:358] "Generic (PLEG): container finished" podID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerID="91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a" exitCode=0
Apr 24 22:10:55.670616 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.670563 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerDied","Data":"91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a"}
Apr 24 22:10:55.670616 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.670570 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"
Apr 24 22:10:55.670616 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.670587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd" event={"ID":"c239bcbf-6b4f-492a-995e-87ca125f4596","Type":"ContainerDied","Data":"c350759d6fd4ed6760189f3ded9e43e8617621493d4a8b71a2666e80856a154b"}
Apr 24 22:10:55.670616 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.670602 2578 scope.go:117] "RemoveContainer" containerID="68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170"
Apr 24 22:10:55.678525 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.678509 2578 scope.go:117] "RemoveContainer" containerID="91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a"
Apr 24 22:10:55.685372 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.685354 2578 scope.go:117] "RemoveContainer" containerID="03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8"
Apr 24 22:10:55.694863 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.694843 2578 scope.go:117] "RemoveContainer" containerID="68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170"
Apr 24 22:10:55.695656 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:10:55.695592 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170\": container with ID starting with 68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170 not found: ID does not exist" containerID="68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170"
Apr 24 22:10:55.695656 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.695631 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170"} err="failed to get container status \"68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170\": rpc error: code = NotFound desc = could not find container \"68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170\": container with ID starting with 68567a139451aea15bd329190cc61d7d8384beb519b0d50c54ff545a5424d170 not found: ID does not exist"
Apr 24 22:10:55.695848 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.695655 2578 scope.go:117] "RemoveContainer" containerID="91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a"
Apr 24 22:10:55.695990 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:10:55.695966 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a\": container with ID starting with 91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a not found: ID does not exist" containerID="91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a"
Apr 24 22:10:55.696051 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.696000 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a"} err="failed to get container status \"91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a\": rpc error: code = NotFound desc = could not find container \"91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a\": container with ID starting with 91b50b7690790d476539497aa7edaaed90b8c9e9094b231d066f4c1692e50e6a not found: ID does not exist"
Apr 24 22:10:55.696051 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.696022 2578 scope.go:117] "RemoveContainer" containerID="03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8"
Apr 24 22:10:55.696380 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:10:55.696361 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8\": container with ID starting with 03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8 not found: ID does not exist" containerID="03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8"
Apr 24 22:10:55.696499 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.696384 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8"} err="failed to get container status \"03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8\": rpc error: code = NotFound desc = could not find container \"03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8\": container with ID starting with 03013b8e9e966d14337a65621eb36fa41cc27cdc5ad805e62eb20a27a37888e8 not found: ID does not exist"
Apr 24 22:10:55.697543 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.697522 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"]
Apr 24 22:10:55.698941 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.698920 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-bp5rd"]
Apr 24 22:10:55.711237 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.711218 2578 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c239bcbf-6b4f-492a-995e-87ca125f4596-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:55.711320 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.711239 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c239bcbf-6b4f-492a-995e-87ca125f4596-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:55.711320 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.711255 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c239bcbf-6b4f-492a-995e-87ca125f4596-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:55.711320 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:55.711267 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jx7xr\" (UniqueName: \"kubernetes.io/projected/c239bcbf-6b4f-492a-995e-87ca125f4596-kube-api-access-jx7xr\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:10:56.660255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:10:56.660223 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" path="/var/lib/kubelet/pods/c239bcbf-6b4f-492a-995e-87ca125f4596/volumes"
Apr 24 22:11:16.772033 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:11:16.772005 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 22:11:16.780995 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:11:16.780974 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 22:12:10.651036 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651001 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"]
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651315 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kube-rbac-proxy"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651327 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kube-rbac-proxy"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651344 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="storage-initializer"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651349 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="storage-initializer"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651356 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kserve-container"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651361 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kserve-container"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651366 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kserve-container"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651371 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kserve-container"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651379 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="storage-initializer"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651383 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="storage-initializer"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651390 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kube-rbac-proxy"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651394 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kube-rbac-proxy"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651441 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kube-rbac-proxy"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651448 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kube-rbac-proxy"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651455 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c239bcbf-6b4f-492a-995e-87ca125f4596" containerName="kserve-container"
Apr 24 22:12:10.651560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.651463 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f18907a-ee4d-470e-bddd-c8ec8099dda0" containerName="kserve-container"
Apr 24 22:12:10.654821 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.654805 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"
Apr 24 22:12:10.657140 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.657114 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\""
Apr 24 22:12:10.657140 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.657130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 22:12:10.657318 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.657176 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 24 22:12:10.657318 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.657217 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\""
Apr 24 22:12:10.657445 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.657428 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 22:12:10.665726 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.665706 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"]
Apr 24 22:12:10.729536 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.729497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"
Apr 24 22:12:10.729632 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.729554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ed3a23c-5e03-480a-916a-94ef044eda73-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"
Apr 24 22:12:10.729717 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.729643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ed3a23c-5e03-480a-916a-94ef044eda73-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"
Apr 24 22:12:10.729717 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.729674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk6sg\" (UniqueName: \"kubernetes.io/projected/7ed3a23c-5e03-480a-916a-94ef044eda73-kube-api-access-gk6sg\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"
Apr 24 22:12:10.830163 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.830139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ed3a23c-5e03-480a-916a-94ef044eda73-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:10.830254 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.830167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk6sg\" (UniqueName: \"kubernetes.io/projected/7ed3a23c-5e03-480a-916a-94ef044eda73-kube-api-access-gk6sg\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:10.830254 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.830203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:10.830337 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:12:10.830285 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:12:10.830337 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.830314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ed3a23c-5e03-480a-916a-94ef044eda73-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:10.830337 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:12:10.830335 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls podName:7ed3a23c-5e03-480a-916a-94ef044eda73 nodeName:}" failed. No retries permitted until 2026-04-24 22:12:11.330320587 +0000 UTC m=+3355.298122627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" (UID: "7ed3a23c-5e03-480a-916a-94ef044eda73") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:12:10.830665 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.830645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ed3a23c-5e03-480a-916a-94ef044eda73-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:10.830803 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.830784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ed3a23c-5e03-480a-916a-94ef044eda73-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:10.838873 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:10.838855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk6sg\" (UniqueName: 
\"kubernetes.io/projected/7ed3a23c-5e03-480a-916a-94ef044eda73-kube-api-access-gk6sg\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:11.335297 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:11.335269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:11.335429 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:12:11.335392 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:12:11.335472 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:12:11.335459 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls podName:7ed3a23c-5e03-480a-916a-94ef044eda73 nodeName:}" failed. No retries permitted until 2026-04-24 22:12:12.335445595 +0000 UTC m=+3356.303247635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" (UID: "7ed3a23c-5e03-480a-916a-94ef044eda73") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:12:12.344440 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:12.344412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:12.346905 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:12.346882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:12.465700 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:12.465674 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:12.584686 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:12.584662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"] Apr 24 22:12:12.586876 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:12:12.586845 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed3a23c_5e03_480a_916a_94ef044eda73.slice/crio-4debdc2bcbec0b63f8d699ee7dad3087966ba743e5cfc27921a2faec883f168f WatchSource:0}: Error finding container 4debdc2bcbec0b63f8d699ee7dad3087966ba743e5cfc27921a2faec883f168f: Status 404 returned error can't find the container with id 4debdc2bcbec0b63f8d699ee7dad3087966ba743e5cfc27921a2faec883f168f Apr 24 22:12:12.876566 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:12.876486 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerStarted","Data":"beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183"} Apr 24 22:12:12.876566 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:12.876521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerStarted","Data":"4debdc2bcbec0b63f8d699ee7dad3087966ba743e5cfc27921a2faec883f168f"} Apr 24 22:12:16.887970 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:16.887892 2578 generic.go:358] "Generic (PLEG): container finished" podID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerID="beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183" exitCode=0 Apr 24 22:12:16.888532 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:16.887978 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerDied","Data":"beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183"} Apr 24 22:12:17.892420 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:17.892383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerStarted","Data":"76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58"} Apr 24 22:12:17.892420 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:17.892424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerStarted","Data":"2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe"} Apr 24 22:12:17.892955 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:17.892643 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:17.919438 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:17.919393 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" podStartSLOduration=7.91938143 podStartE2EDuration="7.91938143s" podCreationTimestamp="2026-04-24 22:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:12:17.91752453 +0000 UTC m=+3361.885326592" watchObservedRunningTime="2026-04-24 22:12:17.91938143 +0000 UTC m=+3361.887183492" Apr 24 22:12:18.895049 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:18.895019 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:24.903039 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:24.903010 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:12:54.907022 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:12:54.906943 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:13:00.759096 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:00.759060 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"] Apr 24 22:13:00.759535 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:00.759386 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kserve-container" containerID="cri-o://2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe" gracePeriod=30 Apr 24 22:13:00.759535 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:00.759436 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kube-rbac-proxy" containerID="cri-o://76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58" gracePeriod=30 Apr 24 22:13:01.014337 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:01.014260 2578 generic.go:358] "Generic (PLEG): container finished" podID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerID="76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58" exitCode=2 Apr 24 22:13:01.014471 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:01.014331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerDied","Data":"76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58"} Apr 24 22:13:04.899429 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:04.899392 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.58:8643/healthz\": dial tcp 10.134.0.58:8643: connect: connection refused" Apr 24 22:13:04.903497 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:04.903463 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:13:06.995922 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:06.995901 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:13:07.033480 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.033454 2578 generic.go:358] "Generic (PLEG): container finished" podID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerID="2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe" exitCode=0 Apr 24 22:13:07.033598 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.033542 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" Apr 24 22:13:07.033676 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.033536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerDied","Data":"2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe"} Apr 24 22:13:07.033676 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.033662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg" event={"ID":"7ed3a23c-5e03-480a-916a-94ef044eda73","Type":"ContainerDied","Data":"4debdc2bcbec0b63f8d699ee7dad3087966ba743e5cfc27921a2faec883f168f"} Apr 24 22:13:07.033809 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.033681 2578 scope.go:117] "RemoveContainer" containerID="76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58" Apr 24 22:13:07.041596 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.041575 2578 scope.go:117] "RemoveContainer" containerID="2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe" Apr 24 22:13:07.048547 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.048531 2578 scope.go:117] "RemoveContainer" containerID="beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183" Apr 24 22:13:07.054896 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.054880 2578 scope.go:117] "RemoveContainer" containerID="76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58" Apr 24 22:13:07.055133 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:13:07.055115 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58\": container with ID starting with 76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58 not found: 
ID does not exist" containerID="76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58" Apr 24 22:13:07.055189 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.055141 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58"} err="failed to get container status \"76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58\": rpc error: code = NotFound desc = could not find container \"76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58\": container with ID starting with 76cf467d62a56f27a98251d683a14f411c0d7498ae06e1be621d307da83edf58 not found: ID does not exist" Apr 24 22:13:07.055189 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.055159 2578 scope.go:117] "RemoveContainer" containerID="2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe" Apr 24 22:13:07.055397 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:13:07.055383 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe\": container with ID starting with 2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe not found: ID does not exist" containerID="2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe" Apr 24 22:13:07.055446 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.055400 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe"} err="failed to get container status \"2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe\": rpc error: code = NotFound desc = could not find container \"2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe\": container with ID starting with 2274c6cbaa631f9381cbeeb2d574a5acfab1e9f2e09662d8c7868bf7396c83fe not found: ID does not 
exist" Apr 24 22:13:07.055446 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.055412 2578 scope.go:117] "RemoveContainer" containerID="beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183" Apr 24 22:13:07.055591 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:13:07.055574 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183\": container with ID starting with beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183 not found: ID does not exist" containerID="beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183" Apr 24 22:13:07.055641 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.055597 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183"} err="failed to get container status \"beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183\": rpc error: code = NotFound desc = could not find container \"beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183\": container with ID starting with beec2ed519ec62ca43d6ba2300ebf67f8b5a469373a68c9147f1e4da8e7f1183 not found: ID does not exist" Apr 24 22:13:07.149453 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.149435 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ed3a23c-5e03-480a-916a-94ef044eda73-kserve-provision-location\") pod \"7ed3a23c-5e03-480a-916a-94ef044eda73\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " Apr 24 22:13:07.149529 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.149461 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk6sg\" (UniqueName: \"kubernetes.io/projected/7ed3a23c-5e03-480a-916a-94ef044eda73-kube-api-access-gk6sg\") pod 
\"7ed3a23c-5e03-480a-916a-94ef044eda73\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " Apr 24 22:13:07.149529 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.149485 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ed3a23c-5e03-480a-916a-94ef044eda73-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"7ed3a23c-5e03-480a-916a-94ef044eda73\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " Apr 24 22:13:07.149601 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.149567 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") pod \"7ed3a23c-5e03-480a-916a-94ef044eda73\" (UID: \"7ed3a23c-5e03-480a-916a-94ef044eda73\") " Apr 24 22:13:07.149791 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.149765 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed3a23c-5e03-480a-916a-94ef044eda73-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7ed3a23c-5e03-480a-916a-94ef044eda73" (UID: "7ed3a23c-5e03-480a-916a-94ef044eda73"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:13:07.149870 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.149856 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed3a23c-5e03-480a-916a-94ef044eda73-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "7ed3a23c-5e03-480a-916a-94ef044eda73" (UID: "7ed3a23c-5e03-480a-916a-94ef044eda73"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:13:07.151605 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.151586 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed3a23c-5e03-480a-916a-94ef044eda73-kube-api-access-gk6sg" (OuterVolumeSpecName: "kube-api-access-gk6sg") pod "7ed3a23c-5e03-480a-916a-94ef044eda73" (UID: "7ed3a23c-5e03-480a-916a-94ef044eda73"). InnerVolumeSpecName "kube-api-access-gk6sg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:13:07.151659 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.151585 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7ed3a23c-5e03-480a-916a-94ef044eda73" (UID: "7ed3a23c-5e03-480a-916a-94ef044eda73"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:13:07.250664 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.250640 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ed3a23c-5e03-480a-916a-94ef044eda73-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:13:07.250664 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.250661 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ed3a23c-5e03-480a-916a-94ef044eda73-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:13:07.250825 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.250671 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gk6sg\" (UniqueName: \"kubernetes.io/projected/7ed3a23c-5e03-480a-916a-94ef044eda73-kube-api-access-gk6sg\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:13:07.250825 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 22:13:07.250681 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ed3a23c-5e03-480a-916a-94ef044eda73-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:13:07.356238 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.356210 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"] Apr 24 22:13:07.359923 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:07.359903 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-cksmg"] Apr 24 22:13:08.659285 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:13:08.659258 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" path="/var/lib/kubelet/pods/7ed3a23c-5e03-480a-916a-94ef044eda73/volumes" Apr 24 22:14:20.997685 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.997651 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"] Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998140 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kserve-container" Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998156 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kserve-container" Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998172 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kube-rbac-proxy" Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998178 2578 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kube-rbac-proxy"
Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998189 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="storage-initializer"
Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998196 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="storage-initializer"
Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998249 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kserve-container"
Apr 24 22:14:21.000204 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:20.998262 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ed3a23c-5e03-480a-916a-94ef044eda73" containerName="kube-rbac-proxy"
Apr 24 22:14:21.001221 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.001204 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.003415 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.003392 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\""
Apr 24 22:14:21.003546 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.003419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 24 22:14:21.003546 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.003507 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\""
Apr 24 22:14:21.003546 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.003516 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 22:14:21.003733 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.003609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\""
Apr 24 22:14:21.004275 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.004261 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 22:14:21.010577 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.010552 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"]
Apr 24 22:14:21.077434 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.077409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k29qd\" (UniqueName: \"kubernetes.io/projected/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kube-api-access-k29qd\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.077555 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.077458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.077555 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.077476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.077660 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.077576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.178140 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.178113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.178140 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.178141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.178305 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.178162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.178305 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.178198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k29qd\" (UniqueName: \"kubernetes.io/projected/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kube-api-access-k29qd\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.178305 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:14:21.178276 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found
Apr 24 22:14:21.178479 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:14:21.178358 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls podName:89afa2b2-7c81-4027-a3e9-2919a9b0b54e nodeName:}" failed. No retries permitted until 2026-04-24 22:14:21.678322755 +0000 UTC m=+3485.646124800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls") pod "isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" (UID: "89afa2b2-7c81-4027-a3e9-2919a9b0b54e") : secret "isvc-sklearn-s3-predictor-serving-cert" not found
Apr 24 22:14:21.178556 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.178509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.178845 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.178825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.186284 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.186265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k29qd\" (UniqueName: \"kubernetes.io/projected/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kube-api-access-k29qd\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.681965 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.681924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.684333 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.684312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-b6dd86fb7-dtt45\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:21.911764 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:21.911718 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:22.031701 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:22.031677 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"]
Apr 24 22:14:22.033845 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:14:22.033782 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89afa2b2_7c81_4027_a3e9_2919a9b0b54e.slice/crio-7d6617731b5286bfc2675cc0dbf0f90c076f877da6c3398fd8baac4ec32380f8 WatchSource:0}: Error finding container 7d6617731b5286bfc2675cc0dbf0f90c076f877da6c3398fd8baac4ec32380f8: Status 404 returned error can't find the container with id 7d6617731b5286bfc2675cc0dbf0f90c076f877da6c3398fd8baac4ec32380f8
Apr 24 22:14:22.035796 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:22.035777 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:14:22.244098 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:22.243998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerStarted","Data":"59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4"}
Apr 24 22:14:22.244098 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:22.244034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerStarted","Data":"7d6617731b5286bfc2675cc0dbf0f90c076f877da6c3398fd8baac4ec32380f8"}
Apr 24 22:14:23.248504 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:23.248469 2578 generic.go:358] "Generic (PLEG): container finished" podID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerID="59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4" exitCode=0
Apr 24 22:14:23.248913 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:23.248570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerDied","Data":"59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4"}
Apr 24 22:14:24.253277 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:24.253246 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerStarted","Data":"77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e"}
Apr 24 22:14:24.253624 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:24.253284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerStarted","Data":"4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c"}
Apr 24 22:14:24.253624 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:24.253364 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:24.272044 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:24.272001 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podStartSLOduration=4.271983163 podStartE2EDuration="4.271983163s" podCreationTimestamp="2026-04-24 22:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:14:24.270182291 +0000 UTC m=+3488.237984353" watchObservedRunningTime="2026-04-24 22:14:24.271983163 +0000 UTC m=+3488.239785226"
Apr 24 22:14:25.255866 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:25.255793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:25.257206 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:25.257176 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:14:26.258640 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:26.258602 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:14:31.262904 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:31.262873 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:14:31.263486 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:31.263460 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:14:41.263417 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:41.263376 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:14:51.263901 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:14:51.263857 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:15:01.263777 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:01.263722 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:15:11.263665 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:11.263626 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:15:21.264240 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:21.264211 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:15:31.093771 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.093721 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"]
Apr 24 22:15:31.094674 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.094588 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" containerID="cri-o://4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c" gracePeriod=30
Apr 24 22:15:31.094674 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.094616 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kube-rbac-proxy" containerID="cri-o://77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e" gracePeriod=30
Apr 24 22:15:31.250735 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.250701 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"]
Apr 24 22:15:31.254132 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.254111 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.256270 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.256247 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\""
Apr 24 22:15:31.256454 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.256430 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\""
Apr 24 22:15:31.256580 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.256491 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 24 22:15:31.259733 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.259698 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.59:8643/healthz\": dial tcp 10.134.0.59:8643: connect: connection refused"
Apr 24 22:15:31.263645 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.263619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"]
Apr 24 22:15:31.263767 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.263727 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 24 22:15:31.292870 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.292846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0e1961-65b6-4357-9133-88565494ba03-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.292937 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.292904 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd0e1961-65b6-4357-9133-88565494ba03-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.292937 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.292926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xdw\" (UniqueName: \"kubernetes.io/projected/cd0e1961-65b6-4357-9133-88565494ba03-kube-api-access-h7xdw\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.293019 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.292959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.293019 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.292977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394146 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0e1961-65b6-4357-9133-88565494ba03-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394146 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd0e1961-65b6-4357-9133-88565494ba03-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394146 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7xdw\" (UniqueName: \"kubernetes.io/projected/cd0e1961-65b6-4357-9133-88565494ba03-kube-api-access-h7xdw\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394417 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394417 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394570 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0e1961-65b6-4357-9133-88565494ba03-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.394964 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.395063 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.394934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.397031 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.397004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd0e1961-65b6-4357-9133-88565494ba03-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.402086 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.402063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xdw\" (UniqueName: \"kubernetes.io/projected/cd0e1961-65b6-4357-9133-88565494ba03-kube-api-access-h7xdw\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.446236 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.446210 2578 generic.go:358] "Generic (PLEG): container finished" podID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerID="77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e" exitCode=2
Apr 24 22:15:31.446340 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.446284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerDied","Data":"77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e"}
Apr 24 22:15:31.565146 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.565118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:31.686934 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:31.686909 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"]
Apr 24 22:15:31.689232 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:15:31.689197 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd0e1961_65b6_4357_9133_88565494ba03.slice/crio-c26169c05a98ec513a55178ec13c0ba08e60871df76b079de9bcac0f7b89f363 WatchSource:0}: Error finding container c26169c05a98ec513a55178ec13c0ba08e60871df76b079de9bcac0f7b89f363: Status 404 returned error can't find the container with id c26169c05a98ec513a55178ec13c0ba08e60871df76b079de9bcac0f7b89f363
Apr 24 22:15:32.451031 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:32.450998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerStarted","Data":"9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0"}
Apr 24 22:15:32.451031 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:32.451034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerStarted","Data":"c26169c05a98ec513a55178ec13c0ba08e60871df76b079de9bcac0f7b89f363"}
Apr 24 22:15:33.455061 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:33.455026 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd0e1961-65b6-4357-9133-88565494ba03" containerID="9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0" exitCode=0
Apr 24 22:15:33.455425 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:33.455101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerDied","Data":"9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0"}
Apr 24 22:15:34.461724 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:34.461684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerStarted","Data":"a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e"}
Apr 24 22:15:34.461724 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:34.461722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerStarted","Data":"dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa"}
Apr 24 22:15:34.462262 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:34.461888 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:34.481714 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:34.481667 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podStartSLOduration=3.481656398 podStartE2EDuration="3.481656398s" podCreationTimestamp="2026-04-24 22:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:15:34.480203067 +0000 UTC m=+3558.448005129" watchObservedRunningTime="2026-04-24 22:15:34.481656398 +0000 UTC m=+3558.449458460"
Apr 24 22:15:35.137810 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.137788 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:15:35.218998 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.218921 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kserve-provision-location\") pod \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") "
Apr 24 22:15:35.218998 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.218978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls\") pod \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") "
Apr 24 22:15:35.219172 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.219028 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") "
Apr 24 22:15:35.219172 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.219057 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k29qd\" (UniqueName: \"kubernetes.io/projected/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kube-api-access-k29qd\") pod \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\" (UID: \"89afa2b2-7c81-4027-a3e9-2919a9b0b54e\") "
Apr 24 22:15:35.219292 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.219271 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89afa2b2-7c81-4027-a3e9-2919a9b0b54e" (UID: "89afa2b2-7c81-4027-a3e9-2919a9b0b54e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:15:35.219421 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.219392 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "89afa2b2-7c81-4027-a3e9-2919a9b0b54e" (UID: "89afa2b2-7c81-4027-a3e9-2919a9b0b54e"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:15:35.221206 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.221183 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kube-api-access-k29qd" (OuterVolumeSpecName: "kube-api-access-k29qd") pod "89afa2b2-7c81-4027-a3e9-2919a9b0b54e" (UID: "89afa2b2-7c81-4027-a3e9-2919a9b0b54e"). InnerVolumeSpecName "kube-api-access-k29qd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:15:35.221303 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.221206 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "89afa2b2-7c81-4027-a3e9-2919a9b0b54e" (UID: "89afa2b2-7c81-4027-a3e9-2919a9b0b54e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:15:35.319901 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.319877 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:15:35.319901 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.319900 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:15:35.320038 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.319910 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:15:35.320038 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.319920 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k29qd\" (UniqueName: \"kubernetes.io/projected/89afa2b2-7c81-4027-a3e9-2919a9b0b54e-kube-api-access-k29qd\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:15:35.466721 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.466687 2578 generic.go:358] "Generic (PLEG): container finished" podID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerID="4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c" exitCode=0
Apr 24 22:15:35.467098 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.466775 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerDied","Data":"4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c"}
Apr 24 22:15:35.467098 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.466797 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"
Apr 24 22:15:35.467098 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.466820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45" event={"ID":"89afa2b2-7c81-4027-a3e9-2919a9b0b54e","Type":"ContainerDied","Data":"7d6617731b5286bfc2675cc0dbf0f90c076f877da6c3398fd8baac4ec32380f8"}
Apr 24 22:15:35.467098 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.466837 2578 scope.go:117] "RemoveContainer" containerID="77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e"
Apr 24 22:15:35.467341 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.467320 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"
Apr 24 22:15:35.468152 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.468122 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:15:35.475132 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.475111 2578 scope.go:117] "RemoveContainer" containerID="4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c"
Apr 24 22:15:35.481934 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.481919 2578 scope.go:117] "RemoveContainer" containerID="59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4"
Apr 24 22:15:35.488985 ip-10-0-134-248
kubenswrapper[2578]: I0424 22:15:35.488968 2578 scope.go:117] "RemoveContainer" containerID="77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e" Apr 24 22:15:35.489272 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:15:35.489251 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e\": container with ID starting with 77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e not found: ID does not exist" containerID="77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e" Apr 24 22:15:35.489343 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.489280 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e"} err="failed to get container status \"77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e\": rpc error: code = NotFound desc = could not find container \"77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e\": container with ID starting with 77591b568ba34b5d6d3d872e77df286fd91cd9e1368fef18064f400b637ea82e not found: ID does not exist" Apr 24 22:15:35.489343 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.489301 2578 scope.go:117] "RemoveContainer" containerID="4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c" Apr 24 22:15:35.489551 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:15:35.489535 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c\": container with ID starting with 4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c not found: ID does not exist" containerID="4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c" Apr 24 22:15:35.489608 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 22:15:35.489556 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c"} err="failed to get container status \"4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c\": rpc error: code = NotFound desc = could not find container \"4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c\": container with ID starting with 4b288fe40086aeb850cb362af656aefa99f2ef559c0fa2f786222291a918fa9c not found: ID does not exist" Apr 24 22:15:35.489608 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.489571 2578 scope.go:117] "RemoveContainer" containerID="59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4" Apr 24 22:15:35.489814 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:15:35.489791 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4\": container with ID starting with 59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4 not found: ID does not exist" containerID="59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4" Apr 24 22:15:35.489906 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.489814 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4"} err="failed to get container status \"59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4\": rpc error: code = NotFound desc = could not find container \"59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4\": container with ID starting with 59a09f95519ba1af04dbd107bba946d44fa81c655ef8c18627e0f741ac6d33d4 not found: ID does not exist" Apr 24 22:15:35.490577 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.490559 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"] Apr 24 22:15:35.492706 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:35.492684 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-b6dd86fb7-dtt45"] Apr 24 22:15:36.470869 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:36.470828 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:15:36.659986 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:36.659953 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" path="/var/lib/kubelet/pods/89afa2b2-7c81-4027-a3e9-2919a9b0b54e/volumes" Apr 24 22:15:41.475797 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:41.475767 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" Apr 24 22:15:41.476401 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:41.476372 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:15:51.476257 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:15:51.476219 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:16:01.476717 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:16:01.476681 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:16:11.477291 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:11.477206 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:16:16.803417 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:16.803388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 22:16:16.809023 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:16.809000 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log" Apr 24 22:16:21.476377 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:21.476339 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:16:31.476363 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:31.476325 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: 
connection refused" Apr 24 22:16:41.476926 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:41.476893 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" Apr 24 22:16:51.246531 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.246491 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"] Apr 24 22:16:51.247022 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.246813 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" containerID="cri-o://dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa" gracePeriod=30 Apr 24 22:16:51.247022 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.246869 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kube-rbac-proxy" containerID="cri-o://a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e" gracePeriod=30 Apr 24 22:16:51.471086 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.471043 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.60:8643/healthz\": dial tcp 10.134.0.60:8643: connect: connection refused" Apr 24 22:16:51.476330 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.476303 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" 
podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:16:51.677556 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.677528 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd0e1961-65b6-4357-9133-88565494ba03" containerID="a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e" exitCode=2 Apr 24 22:16:51.677709 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:51.677581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerDied","Data":"a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e"} Apr 24 22:16:52.330275 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330236 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2"] Apr 24 22:16:52.330634 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330625 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="storage-initializer" Apr 24 22:16:52.330680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330640 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="storage-initializer" Apr 24 22:16:52.330680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330662 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" Apr 24 22:16:52.330680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330667 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" Apr 24 22:16:52.330680 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 22:16:52.330680 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kube-rbac-proxy" Apr 24 22:16:52.330823 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330685 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kube-rbac-proxy" Apr 24 22:16:52.330823 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330762 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kserve-container" Apr 24 22:16:52.330823 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.330773 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="89afa2b2-7c81-4027-a3e9-2919a9b0b54e" containerName="kube-rbac-proxy" Apr 24 22:16:52.334011 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.333993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.336570 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.336545 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 24 22:16:52.336693 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.336554 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 24 22:16:52.343190 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.343168 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2"] Apr 24 22:16:52.498094 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.498067 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.498207 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.498097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.498207 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.498128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc6sw\" (UniqueName: \"kubernetes.io/projected/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kube-api-access-gc6sw\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.498207 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.498194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.598902 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.598824 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.598902 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.598863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.598902 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.598899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc6sw\" (UniqueName: \"kubernetes.io/projected/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kube-api-access-gc6sw\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.599142 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.598929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.599142 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:16:52.598949 2578 
secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 24 22:16:52.599142 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:16:52.599005 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls podName:5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5 nodeName:}" failed. No retries permitted until 2026-04-24 22:16:53.098989313 +0000 UTC m=+3637.066791353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" (UID: "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 24 22:16:52.599315 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.599297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.599560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:52.599543 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:52.607382 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:16:52.607356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc6sw\" (UniqueName: \"kubernetes.io/projected/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kube-api-access-gc6sw\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:53.103013 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:53.102975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:53.105375 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:53.105355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:53.244962 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:53.244932 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:16:53.365878 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:53.365854 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2"] Apr 24 22:16:53.368287 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:16:53.368255 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9f72c2_f4f3_4304_a1e9_d6eabb45b6d5.slice/crio-d8ed019f39c0d6a815762413107836e662beb1e0fc4324cfc3ab7f78e71f4b42 WatchSource:0}: Error finding container d8ed019f39c0d6a815762413107836e662beb1e0fc4324cfc3ab7f78e71f4b42: Status 404 returned error can't find the container with id d8ed019f39c0d6a815762413107836e662beb1e0fc4324cfc3ab7f78e71f4b42 Apr 24 22:16:53.690184 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:53.690145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" event={"ID":"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5","Type":"ContainerStarted","Data":"1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d"} Apr 24 22:16:53.690184 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:53.690184 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" event={"ID":"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5","Type":"ContainerStarted","Data":"d8ed019f39c0d6a815762413107836e662beb1e0fc4324cfc3ab7f78e71f4b42"} Apr 24 22:16:55.088538 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.088519 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" Apr 24 22:16:55.220113 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220055 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd0e1961-65b6-4357-9133-88565494ba03-proxy-tls\") pod \"cd0e1961-65b6-4357-9133-88565494ba03\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " Apr 24 22:16:55.220113 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220099 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0e1961-65b6-4357-9133-88565494ba03-kserve-provision-location\") pod \"cd0e1961-65b6-4357-9133-88565494ba03\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " Apr 24 22:16:55.220256 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220138 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-cabundle-cert\") pod \"cd0e1961-65b6-4357-9133-88565494ba03\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " Apr 24 22:16:55.220256 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220194 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7xdw\" (UniqueName: \"kubernetes.io/projected/cd0e1961-65b6-4357-9133-88565494ba03-kube-api-access-h7xdw\") pod \"cd0e1961-65b6-4357-9133-88565494ba03\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " Apr 24 22:16:55.220256 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220216 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod 
\"cd0e1961-65b6-4357-9133-88565494ba03\" (UID: \"cd0e1961-65b6-4357-9133-88565494ba03\") " Apr 24 22:16:55.220545 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220515 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd0e1961-65b6-4357-9133-88565494ba03-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cd0e1961-65b6-4357-9133-88565494ba03" (UID: "cd0e1961-65b6-4357-9133-88565494ba03"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:55.220545 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220536 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "cd0e1961-65b6-4357-9133-88565494ba03" (UID: "cd0e1961-65b6-4357-9133-88565494ba03"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:55.220674 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.220590 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "cd0e1961-65b6-4357-9133-88565494ba03" (UID: "cd0e1961-65b6-4357-9133-88565494ba03"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:55.222235 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.222218 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0e1961-65b6-4357-9133-88565494ba03-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cd0e1961-65b6-4357-9133-88565494ba03" (UID: "cd0e1961-65b6-4357-9133-88565494ba03"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:55.222333 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.222315 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0e1961-65b6-4357-9133-88565494ba03-kube-api-access-h7xdw" (OuterVolumeSpecName: "kube-api-access-h7xdw") pod "cd0e1961-65b6-4357-9133-88565494ba03" (UID: "cd0e1961-65b6-4357-9133-88565494ba03"). InnerVolumeSpecName "kube-api-access-h7xdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:55.321284 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.321257 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-cabundle-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:16:55.321284 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.321281 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7xdw\" (UniqueName: \"kubernetes.io/projected/cd0e1961-65b6-4357-9133-88565494ba03-kube-api-access-h7xdw\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:16:55.321418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.321298 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cd0e1961-65b6-4357-9133-88565494ba03-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:16:55.321418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.321313 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd0e1961-65b6-4357-9133-88565494ba03-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:16:55.321418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.321326 2578 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0e1961-65b6-4357-9133-88565494ba03-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:16:55.697936 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.697902 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd0e1961-65b6-4357-9133-88565494ba03" containerID="dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa" exitCode=0 Apr 24 22:16:55.698103 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.697945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerDied","Data":"dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa"} Apr 24 22:16:55.698103 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.697972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" event={"ID":"cd0e1961-65b6-4357-9133-88565494ba03","Type":"ContainerDied","Data":"c26169c05a98ec513a55178ec13c0ba08e60871df76b079de9bcac0f7b89f363"} Apr 24 22:16:55.698103 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.697981 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng" Apr 24 22:16:55.698103 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.697988 2578 scope.go:117] "RemoveContainer" containerID="a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e" Apr 24 22:16:55.707472 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.707455 2578 scope.go:117] "RemoveContainer" containerID="dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa" Apr 24 22:16:55.714368 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.714347 2578 scope.go:117] "RemoveContainer" containerID="9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0" Apr 24 22:16:55.719646 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.719619 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"] Apr 24 22:16:55.721643 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.721624 2578 scope.go:117] "RemoveContainer" containerID="a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e" Apr 24 22:16:55.721942 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:16:55.721920 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e\": container with ID starting with a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e not found: ID does not exist" containerID="a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e" Apr 24 22:16:55.722036 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.721953 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e"} err="failed to get container status \"a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e\": rpc error: code = NotFound desc = 
could not find container \"a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e\": container with ID starting with a2293c6df190aa202344b8da2f499a1263c5a2cd7d94589b9b2fbeaa46d83a1e not found: ID does not exist" Apr 24 22:16:55.722036 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.721976 2578 scope.go:117] "RemoveContainer" containerID="dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa" Apr 24 22:16:55.722257 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:16:55.722235 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa\": container with ID starting with dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa not found: ID does not exist" containerID="dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa" Apr 24 22:16:55.722313 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.722268 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa"} err="failed to get container status \"dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa\": rpc error: code = NotFound desc = could not find container \"dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa\": container with ID starting with dc6788a201a2d85850442bd60e542d5b9700775eb484c779e663500ffdb3a1fa not found: ID does not exist" Apr 24 22:16:55.722313 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.722289 2578 scope.go:117] "RemoveContainer" containerID="9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0" Apr 24 22:16:55.722558 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:16:55.722536 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0\": 
container with ID starting with 9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0 not found: ID does not exist" containerID="9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0" Apr 24 22:16:55.722606 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.722570 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0"} err="failed to get container status \"9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0\": rpc error: code = NotFound desc = could not find container \"9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0\": container with ID starting with 9e6689ed442d6880c415ffd956bd9a31a2ab01ac9a96fa2a117a0d3a97414bc0 not found: ID does not exist" Apr 24 22:16:55.723212 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:55.723195 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-688df9b567-6kwng"] Apr 24 22:16:56.659152 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:56.659121 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd0e1961-65b6-4357-9133-88565494ba03" path="/var/lib/kubelet/pods/cd0e1961-65b6-4357-9133-88565494ba03/volumes" Apr 24 22:16:57.706756 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:57.706731 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2_5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/storage-initializer/0.log" Apr 24 22:16:57.707053 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:57.706792 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerID="1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d" exitCode=1 Apr 24 22:16:57.707053 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:57.706848 2578 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" event={"ID":"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5","Type":"ContainerDied","Data":"1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d"} Apr 24 22:16:58.711331 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:58.711303 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2_5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/storage-initializer/0.log" Apr 24 22:16:58.711697 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:16:58.711390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" event={"ID":"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5","Type":"ContainerStarted","Data":"c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da"} Apr 24 22:17:02.316909 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:02.316876 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2"] Apr 24 22:17:02.317354 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:02.317156 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer" containerID="cri-o://c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da" gracePeriod=30 Apr 24 22:17:03.401723 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.401685 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"] Apr 24 22:17:03.402257 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402205 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd0e1961-65b6-4357-9133-88565494ba03" 
containerName="kserve-container" Apr 24 22:17:03.402257 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402225 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" Apr 24 22:17:03.402257 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402251 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="storage-initializer" Apr 24 22:17:03.402418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402259 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="storage-initializer" Apr 24 22:17:03.402418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402279 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kube-rbac-proxy" Apr 24 22:17:03.402418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402288 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kube-rbac-proxy" Apr 24 22:17:03.402418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402370 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kube-rbac-proxy" Apr 24 22:17:03.402418 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.402385 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd0e1961-65b6-4357-9133-88565494ba03" containerName="kserve-container" Apr 24 22:17:03.405860 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.405838 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.408370 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.408347 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 24 22:17:03.408521 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.408390 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:17:03.408521 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.408438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:17:03.415046 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.414995 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"] Apr 24 22:17:03.472541 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.472510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.472724 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.472562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.472724 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.472593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.472854 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.472731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.472854 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.472825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlbg\" (UniqueName: \"kubernetes.io/projected/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kube-api-access-xzlbg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.573241 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.573211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-proxy-tls\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.573241 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.573247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.573487 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.573266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.573487 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.573312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.573487 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.573341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlbg\" (UniqueName: 
\"kubernetes.io/projected/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kube-api-access-xzlbg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.573876 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.573850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.574165 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.574002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.574165 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.574023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.575740 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.575720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.583209 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.582640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlbg\" (UniqueName: \"kubernetes.io/projected/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kube-api-access-xzlbg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.655179 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.655126 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2_5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/storage-initializer/1.log" Apr 24 22:17:03.655527 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.655491 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2_5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/storage-initializer/0.log" Apr 24 22:17:03.655595 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.655571 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:17:03.673655 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.673619 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc6sw\" (UniqueName: \"kubernetes.io/projected/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kube-api-access-gc6sw\") pod \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " Apr 24 22:17:03.673803 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.673659 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls\") pod \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " Apr 24 22:17:03.673803 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.673679 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kserve-provision-location\") pod \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " Apr 24 22:17:03.673923 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.673803 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\" (UID: \"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5\") " Apr 24 22:17:03.674454 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.674247 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" (UID: "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:17:03.674608 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.674579 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" (UID: "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:17:03.676446 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.676422 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" (UID: "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:17:03.676536 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.676515 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kube-api-access-gc6sw" (OuterVolumeSpecName: "kube-api-access-gc6sw") pod "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" (UID: "5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5"). InnerVolumeSpecName "kube-api-access-gc6sw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:17:03.719424 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.719399 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" Apr 24 22:17:03.725191 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2_5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/storage-initializer/1.log" Apr 24 22:17:03.725544 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725530 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2_5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/storage-initializer/0.log" Apr 24 22:17:03.725588 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725564 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerID="c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da" exitCode=1 Apr 24 22:17:03.725635 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725591 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" event={"ID":"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5","Type":"ContainerDied","Data":"c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da"} Apr 24 22:17:03.725670 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" event={"ID":"5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5","Type":"ContainerDied","Data":"d8ed019f39c0d6a815762413107836e662beb1e0fc4324cfc3ab7f78e71f4b42"} Apr 24 22:17:03.725670 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725652 2578 scope.go:117] "RemoveContainer" containerID="c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da" Apr 24 22:17:03.725770 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.725672 2578 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2" Apr 24 22:17:03.734533 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.734505 2578 scope.go:117] "RemoveContainer" containerID="1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d" Apr 24 22:17:03.741479 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.741463 2578 scope.go:117] "RemoveContainer" containerID="c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da" Apr 24 22:17:03.741818 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:17:03.741789 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da\": container with ID starting with c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da not found: ID does not exist" containerID="c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da" Apr 24 22:17:03.741906 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.741823 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da"} err="failed to get container status \"c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da\": rpc error: code = NotFound desc = could not find container \"c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da\": container with ID starting with c5ef46778342bd1e023576944c39df7057054be9e0b0627430c0fc3ce71445da not found: ID does not exist" Apr 24 22:17:03.741906 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.741838 2578 scope.go:117] "RemoveContainer" containerID="1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d" Apr 24 22:17:03.742102 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:17:03.742085 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d\": container with ID starting with 1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d not found: ID does not exist" containerID="1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d"
Apr 24 22:17:03.742147 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.742108 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d"} err="failed to get container status \"1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d\": rpc error: code = NotFound desc = could not find container \"1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d\": container with ID starting with 1e4d1dab086a11cd9df42fb4667bcdb1ae5a37f0104c86c06692933d645c1e0d not found: ID does not exist"
Apr 24 22:17:03.761187 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.761158 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2"]
Apr 24 22:17:03.765849 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.765826 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-548d449c8f-rpfk2"]
Apr 24 22:17:03.775324 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.775299 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:17:03.775436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.775328 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gc6sw\" (UniqueName: \"kubernetes.io/projected/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kube-api-access-gc6sw\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:17:03.775436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.775343 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:17:03.775436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.775381 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:17:03.845956 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:03.845926 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"]
Apr 24 22:17:03.850562 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:17:03.850540 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac44fff7_fdd4_41ac_8ebc_a320f09599cd.slice/crio-a9a94fc9ed91fbeca2900380f96bb2971f5a7310006d7f9b65a00a7fb3c83d85 WatchSource:0}: Error finding container a9a94fc9ed91fbeca2900380f96bb2971f5a7310006d7f9b65a00a7fb3c83d85: Status 404 returned error can't find the container with id a9a94fc9ed91fbeca2900380f96bb2971f5a7310006d7f9b65a00a7fb3c83d85
Apr 24 22:17:04.660714 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:04.660682 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" path="/var/lib/kubelet/pods/5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5/volumes"
Apr 24 22:17:04.730290 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:04.730245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerStarted","Data":"dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9"}
Apr 24 22:17:04.730290 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:04.730289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerStarted","Data":"a9a94fc9ed91fbeca2900380f96bb2971f5a7310006d7f9b65a00a7fb3c83d85"}
Apr 24 22:17:05.735403 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:05.735362 2578 generic.go:358] "Generic (PLEG): container finished" podID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerID="dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9" exitCode=0
Apr 24 22:17:05.735791 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:05.735418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerDied","Data":"dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9"}
Apr 24 22:17:06.740687 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:06.740650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerStarted","Data":"7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338"}
Apr 24 22:17:06.740687 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:06.740687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerStarted","Data":"1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5"}
Apr 24 22:17:06.741087 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:06.740845 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"
Apr 24 22:17:06.763344 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:06.763298 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podStartSLOduration=3.763286421 podStartE2EDuration="3.763286421s" podCreationTimestamp="2026-04-24 22:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:17:06.761515862 +0000 UTC m=+3650.729317924" watchObservedRunningTime="2026-04-24 22:17:06.763286421 +0000 UTC m=+3650.731088482"
Apr 24 22:17:07.744280 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:07.744244 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"
Apr 24 22:17:07.745469 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:07.745439 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:17:08.746991 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:08.746942 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:17:13.750859 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:13.750831 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"
Apr 24 22:17:13.751403 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:13.751376 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:17:23.752222 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:23.752187 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:17:33.751898 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:33.751861 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:17:43.751977 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:43.751937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:17:53.752185 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17:53.752140 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:18:03.752734 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:03.752710 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"
Apr 24 22:18:13.413958 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.413922 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"]
Apr 24 22:18:13.414356 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.414252 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" containerID="cri-o://1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5" gracePeriod=30
Apr 24 22:18:13.414356 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.414291 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kube-rbac-proxy" containerID="cri-o://7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338" gracePeriod=30
Apr 24 22:18:13.747863 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.747734 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.62:8643/healthz\": dial tcp 10.134.0.62:8643: connect: connection refused"
Apr 24 22:18:13.752182 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.752155 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 22:18:13.929820 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.929786 2578 generic.go:358] "Generic (PLEG): container finished" podID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerID="7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338" exitCode=2
Apr 24 22:18:13.929979 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:13.929863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerDied","Data":"7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338"}
Apr 24 22:18:14.498232 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498201 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"]
Apr 24 22:18:14.498602 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498544 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer"
Apr 24 22:18:14.498602 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498556 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer"
Apr 24 22:18:14.498602 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498566 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer"
Apr 24 22:18:14.498602 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498572 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer"
Apr 24 22:18:14.498736 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498632 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer"
Apr 24 22:18:14.498736 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.498642 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a9f72c2-f4f3-4304-a1e9-d6eabb45b6d5" containerName="storage-initializer"
Apr 24 22:18:14.503923 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.502426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.505321 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.505297 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\""
Apr 24 22:18:14.505527 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.505313 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\""
Apr 24 22:18:14.512128 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.512106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"]
Apr 24 22:18:14.594285 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.594254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhrdw\" (UniqueName: \"kubernetes.io/projected/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kube-api-access-zhrdw\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.594445 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.594301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.594445 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.594384 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.594445 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.594432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.694844 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.694813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhrdw\" (UniqueName: \"kubernetes.io/projected/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kube-api-access-zhrdw\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.695024 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.694871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.695024 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.694904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.695024 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.694945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.695024 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:14.695001 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found
Apr 24 22:18:14.695259 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:14.695094 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls podName:9b85d6f3-d103-43ba-9ce9-53fe8803ad67 nodeName:}" failed. No retries permitted until 2026-04-24 22:18:15.195060608 +0000 UTC m=+3719.162862665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" (UID: "9b85d6f3-d103-43ba-9ce9-53fe8803ad67") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found
Apr 24 22:18:14.695259 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.695199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.695542 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.695524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:14.703645 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:14.703626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhrdw\" (UniqueName: \"kubernetes.io/projected/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kube-api-access-zhrdw\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:15.199203 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:15.199164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:15.201713 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:15.201687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:15.416382 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:15.416350 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"
Apr 24 22:18:15.540157 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:15.540131 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"]
Apr 24 22:18:15.542819 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:18:15.542790 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b85d6f3_d103_43ba_9ce9_53fe8803ad67.slice/crio-ad8361d7b79ae6c13310d8a428e3a1c15987777d574575f4b5c87b18c8110bc9 WatchSource:0}: Error finding container ad8361d7b79ae6c13310d8a428e3a1c15987777d574575f4b5c87b18c8110bc9: Status 404 returned error can't find the container with id ad8361d7b79ae6c13310d8a428e3a1c15987777d574575f4b5c87b18c8110bc9
Apr 24 22:18:15.937194 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:15.937151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" event={"ID":"9b85d6f3-d103-43ba-9ce9-53fe8803ad67","Type":"ContainerStarted","Data":"b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b"}
Apr 24 22:18:15.937194 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:15.937188 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" event={"ID":"9b85d6f3-d103-43ba-9ce9-53fe8803ad67","Type":"ContainerStarted","Data":"ad8361d7b79ae6c13310d8a428e3a1c15987777d574575f4b5c87b18c8110bc9"}
Apr 24 22:18:17.654326 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.654302 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"
Apr 24 22:18:17.820638 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.820558 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kserve-provision-location\") pod \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") "
Apr 24 22:18:17.820638 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.820600 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzlbg\" (UniqueName: \"kubernetes.io/projected/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kube-api-access-xzlbg\") pod \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") "
Apr 24 22:18:17.820893 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.820656 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-cabundle-cert\") pod \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") "
Apr 24 22:18:17.820893 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.820682 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-proxy-tls\") pod \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") "
Apr 24 22:18:17.820893 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.820706 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\" (UID: \"ac44fff7-fdd4-41ac-8ebc-a320f09599cd\") "
Apr 24 22:18:17.821033 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.820942 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ac44fff7-fdd4-41ac-8ebc-a320f09599cd" (UID: "ac44fff7-fdd4-41ac-8ebc-a320f09599cd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:18:17.821071 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.821056 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "ac44fff7-fdd4-41ac-8ebc-a320f09599cd" (UID: "ac44fff7-fdd4-41ac-8ebc-a320f09599cd"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:18:17.821108 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.821079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ac44fff7-fdd4-41ac-8ebc-a320f09599cd" (UID: "ac44fff7-fdd4-41ac-8ebc-a320f09599cd"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:18:17.822942 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.822916 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ac44fff7-fdd4-41ac-8ebc-a320f09599cd" (UID: "ac44fff7-fdd4-41ac-8ebc-a320f09599cd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:18:17.823034 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.822942 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kube-api-access-xzlbg" (OuterVolumeSpecName: "kube-api-access-xzlbg") pod "ac44fff7-fdd4-41ac-8ebc-a320f09599cd" (UID: "ac44fff7-fdd4-41ac-8ebc-a320f09599cd"). InnerVolumeSpecName "kube-api-access-xzlbg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:18:17.922018 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.921988 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-cabundle-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:18:17.922018 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.922016 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:18:17.922018 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:17.922025 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:18:17.922214 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.922035 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:18:17.922214 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.922045 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzlbg\" (UniqueName: \"kubernetes.io/projected/ac44fff7-fdd4-41ac-8ebc-a320f09599cd-kube-api-access-xzlbg\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\""
Apr 24 22:18:17.950650 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.950618 2578 generic.go:358] "Generic (PLEG): container finished" podID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerID="1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5" exitCode=0
Apr 24 22:18:17.950811 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.950666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerDied","Data":"1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5"}
Apr 24 22:18:17.950811 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.950687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6" event={"ID":"ac44fff7-fdd4-41ac-8ebc-a320f09599cd","Type":"ContainerDied","Data":"a9a94fc9ed91fbeca2900380f96bb2971f5a7310006d7f9b65a00a7fb3c83d85"}
Apr 24 22:18:17.950811 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.950701 2578 scope.go:117] "RemoveContainer" containerID="7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338"
Apr 24 22:18:17.950811 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.950720 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"
Apr 24 22:18:17.959061 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.959045 2578 scope.go:117] "RemoveContainer" containerID="1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5"
Apr 24 22:18:17.965886 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.965865 2578 scope.go:117] "RemoveContainer" containerID="dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9"
Apr 24 22:18:17.972845 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.972821 2578 scope.go:117] "RemoveContainer" containerID="7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338"
Apr 24 22:18:17.973065 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.973045 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"]
Apr 24 22:18:17.973133 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:17.973080 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338\": container with ID starting with 7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338 not found: ID does not exist" containerID="7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338"
Apr 24 22:18:17.973133 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.973109 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338"} err="failed to get container status \"7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338\": rpc error: code = NotFound desc = could not find container \"7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338\": container with ID starting with 7befe16b7b668d7e4c2bb0c70ff6a93efd2f80ea95cb2274865de17a8a059338 not found: ID does not exist"
Apr 24 22:18:17.973133 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.973128 2578 scope.go:117] "RemoveContainer" containerID="1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5"
Apr 24 22:18:17.973362 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:17.973346 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5\": container with ID starting with 1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5 not found: ID does not exist" containerID="1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5"
Apr 24 22:18:17.973402 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.973368 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5"} err="failed to get container status \"1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5\": rpc error: code = NotFound desc = could not find container \"1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5\": container with ID starting with 1ccb5bed107dbdba85727c8183f98936148dcc365b4a51e8013eac458f3124d5 not found: ID does not exist"
Apr 24 22:18:17.973402 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.973383 2578 scope.go:117] "RemoveContainer" containerID="dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9"
Apr 24 22:18:17.973599 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:17.973581 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9\": container with ID starting with dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9 not found: ID does not exist" containerID="dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9"
Apr 24 22:18:17.973658 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.973608 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9"} err="failed to get container status \"dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9\": rpc error: code = NotFound desc = could not find container \"dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9\": container with ID starting with dccb1c0e50391b55c5c8535f5b631d2dff222cc9c06f90ce84d6ccecd29566b9 not found: ID does not exist"
Apr 24 22:18:17.976258 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:17.976240 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-78b57684f9-pzzw6"]
Apr 24 22:18:18.660465 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:18.660432 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" path="/var/lib/kubelet/pods/ac44fff7-fdd4-41ac-8ebc-a320f09599cd/volumes"
Apr 24 22:18:19.957612 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:19.957586 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl_9b85d6f3-d103-43ba-9ce9-53fe8803ad67/storage-initializer/0.log"
Apr 24 22:18:19.957983 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:19.957625 2578 generic.go:358] "Generic (PLEG): container finished" podID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerID="b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b" exitCode=1
Apr 24 22:18:19.957983 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:19.957700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" event={"ID":"9b85d6f3-d103-43ba-9ce9-53fe8803ad67","Type":"ContainerDied","Data":"b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b"}
Apr 24 22:18:20.961654 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:20.961631 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl_9b85d6f3-d103-43ba-9ce9-53fe8803ad67/storage-initializer/0.log"
Apr 24 22:18:20.962038 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:20.961713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" event={"ID":"9b85d6f3-d103-43ba-9ce9-53fe8803ad67","Type":"ContainerStarted","Data":"82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63"}
Apr 24 22:18:24.486552 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:24.486510 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"]
Apr 24 22:18:24.487007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:24.486787 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" containerID="cri-o://82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63" gracePeriod=30
Apr 24 22:18:25.574296 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574263 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2"]
Apr 24 22:18:25.574680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574628 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="storage-initializer"
Apr 24 22:18:25.574680 ip-10-0-134-248 kubenswrapper[2578]: I0424
22:18:25.574639 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="storage-initializer" Apr 24 22:18:25.574680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574650 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kube-rbac-proxy" Apr 24 22:18:25.574680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574655 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kube-rbac-proxy" Apr 24 22:18:25.574680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574667 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" Apr 24 22:18:25.574680 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574672 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" Apr 24 22:18:25.574916 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574730 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kserve-container" Apr 24 22:18:25.574916 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.574741 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac44fff7-fdd4-41ac-8ebc-a320f09599cd" containerName="kube-rbac-proxy" Apr 24 22:18:25.577934 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.577913 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.580391 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.580370 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 24 22:18:25.580495 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.580379 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:18:25.580495 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.580380 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:18:25.588973 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.588952 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2"] Apr 24 22:18:25.682160 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.682132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksgj\" (UniqueName: \"kubernetes.io/projected/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kube-api-access-2ksgj\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.682298 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.682172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.682298 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.682208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.682298 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.682283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.682410 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.682334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.783137 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.783109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksgj\" (UniqueName: \"kubernetes.io/projected/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kube-api-access-2ksgj\") pod 
\"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.783421 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.783382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.783630 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.783603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.783779 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.783680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.783779 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.783736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.784197 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.784130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.784346 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.784325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.784469 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.784406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.786014 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.785994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.791434 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.791409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksgj\" (UniqueName: \"kubernetes.io/projected/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kube-api-access-2ksgj\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:25.888655 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:25.888581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:26.015153 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.015113 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2"] Apr 24 22:18:26.022075 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:18:26.022049 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151ccca9_3ef0_4c70_8ed7_32bf851be5ba.slice/crio-9c6b23afd43702e98e8540be4b571114cf407cd86ea026cb9b1d5f576161f2d4 WatchSource:0}: Error finding container 9c6b23afd43702e98e8540be4b571114cf407cd86ea026cb9b1d5f576161f2d4: Status 404 returned error can't find the container with id 9c6b23afd43702e98e8540be4b571114cf407cd86ea026cb9b1d5f576161f2d4 Apr 24 22:18:26.117007 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.116985 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl_9b85d6f3-d103-43ba-9ce9-53fe8803ad67/storage-initializer/1.log" Apr 24 22:18:26.117396 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.117377 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl_9b85d6f3-d103-43ba-9ce9-53fe8803ad67/storage-initializer/0.log" Apr 24 22:18:26.117478 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.117464 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" Apr 24 22:18:26.288571 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.288539 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kserve-provision-location\") pod \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " Apr 24 22:18:26.288742 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.288629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " Apr 24 22:18:26.288742 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.288663 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls\") pod \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " Apr 24 22:18:26.288896 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.288812 
2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b85d6f3-d103-43ba-9ce9-53fe8803ad67" (UID: "9b85d6f3-d103-43ba-9ce9-53fe8803ad67"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:18:26.288896 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.288829 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhrdw\" (UniqueName: \"kubernetes.io/projected/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kube-api-access-zhrdw\") pod \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\" (UID: \"9b85d6f3-d103-43ba-9ce9-53fe8803ad67\") " Apr 24 22:18:26.289014 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.288993 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "9b85d6f3-d103-43ba-9ce9-53fe8803ad67" (UID: "9b85d6f3-d103-43ba-9ce9-53fe8803ad67"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:18:26.289075 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.289069 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:18:26.289117 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.289081 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:18:26.290871 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.290851 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9b85d6f3-d103-43ba-9ce9-53fe8803ad67" (UID: "9b85d6f3-d103-43ba-9ce9-53fe8803ad67"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:18:26.290932 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.290900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kube-api-access-zhrdw" (OuterVolumeSpecName: "kube-api-access-zhrdw") pod "9b85d6f3-d103-43ba-9ce9-53fe8803ad67" (UID: "9b85d6f3-d103-43ba-9ce9-53fe8803ad67"). InnerVolumeSpecName "kube-api-access-zhrdw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:18:26.389568 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.389542 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:18:26.389568 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.389565 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhrdw\" (UniqueName: \"kubernetes.io/projected/9b85d6f3-d103-43ba-9ce9-53fe8803ad67-kube-api-access-zhrdw\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:18:26.981266 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.981177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerStarted","Data":"27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235"} Apr 24 22:18:26.981266 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.981229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerStarted","Data":"9c6b23afd43702e98e8540be4b571114cf407cd86ea026cb9b1d5f576161f2d4"} Apr 24 22:18:26.982382 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982362 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl_9b85d6f3-d103-43ba-9ce9-53fe8803ad67/storage-initializer/1.log" Apr 24 22:18:26.982736 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982721 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl_9b85d6f3-d103-43ba-9ce9-53fe8803ad67/storage-initializer/0.log" Apr 24 22:18:26.982807 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982776 2578 generic.go:358] "Generic (PLEG): container finished" podID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerID="82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63" exitCode=1 Apr 24 22:18:26.982854 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" event={"ID":"9b85d6f3-d103-43ba-9ce9-53fe8803ad67","Type":"ContainerDied","Data":"82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63"} Apr 24 22:18:26.982886 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" event={"ID":"9b85d6f3-d103-43ba-9ce9-53fe8803ad67","Type":"ContainerDied","Data":"ad8361d7b79ae6c13310d8a428e3a1c15987777d574575f4b5c87b18c8110bc9"} Apr 24 22:18:26.982886 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982873 2578 scope.go:117] "RemoveContainer" containerID="82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63" Apr 24 22:18:26.982950 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.982905 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl" Apr 24 22:18:26.992008 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:26.991977 2578 scope.go:117] "RemoveContainer" containerID="b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b" Apr 24 22:18:27.000303 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.000285 2578 scope.go:117] "RemoveContainer" containerID="82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63" Apr 24 22:18:27.000569 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:27.000540 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63\": container with ID starting with 82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63 not found: ID does not exist" containerID="82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63" Apr 24 22:18:27.000655 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.000571 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63"} err="failed to get container status \"82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63\": rpc error: code = NotFound desc = could not find container \"82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63\": container with ID starting with 82f749cb5697ce00e19c6e6bd59932cdcfbacc2535ded262d254b472ae815f63 not found: ID does not exist" Apr 24 22:18:27.000655 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.000595 2578 scope.go:117] "RemoveContainer" containerID="b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b" Apr 24 22:18:27.001005 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:18:27.000985 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b\": container with ID starting with b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b not found: ID does not exist" containerID="b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b" Apr 24 22:18:27.001080 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.001009 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b"} err="failed to get container status \"b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b\": rpc error: code = NotFound desc = could not find container \"b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b\": container with ID starting with b425db35cd3a2bd8dd321a056137f84053f6a408c063344bd6658b34da6de19b not found: ID does not exist" Apr 24 22:18:27.024685 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.024654 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"] Apr 24 22:18:27.029116 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.029081 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5cbcb57d85-66dfl"] Apr 24 22:18:27.987415 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.987381 2578 generic.go:358] "Generic (PLEG): container finished" podID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerID="27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235" exitCode=0 Apr 24 22:18:27.987873 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:27.987468 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerDied","Data":"27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235"} Apr 24 
22:18:28.659642 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:28.659610 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" path="/var/lib/kubelet/pods/9b85d6f3-d103-43ba-9ce9-53fe8803ad67/volumes" Apr 24 22:18:28.998258 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:28.998168 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerStarted","Data":"fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c"} Apr 24 22:18:28.998258 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:28.998207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerStarted","Data":"a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a"} Apr 24 22:18:28.998743 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:28.998310 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:29.018384 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:29.018341 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podStartSLOduration=4.018330426 podStartE2EDuration="4.018330426s" podCreationTimestamp="2026-04-24 22:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:18:29.016956754 +0000 UTC m=+3732.984758814" watchObservedRunningTime="2026-04-24 22:18:29.018330426 +0000 UTC m=+3732.986132487" Apr 24 22:18:30.001737 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:30.001707 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:30.003121 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:30.003091 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:18:31.005079 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:31.005037 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:18:36.008976 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:36.008946 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:18:36.009564 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:36.009536 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:18:46.009778 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:18:46.009720 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:18:56.010043 ip-10-0-134-248 kubenswrapper[2578]: I0424 
22:18:56.009957 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:19:06.009790 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:06.009737 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:19:16.009971 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:16.009928 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:19:26.009569 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:26.009530 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:19:36.009957 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:36.009920 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:19:45.653241 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:45.653210 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2"] Apr 24 22:19:45.653684 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:19:45.653527 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" containerID="cri-o://a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a" gracePeriod=30 Apr 24 22:19:45.653684 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:45.653577 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kube-rbac-proxy" containerID="cri-o://fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c" gracePeriod=30 Apr 24 22:19:46.006058 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.005984 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.64:8643/healthz\": dial tcp 10.134.0.64:8643: connect: connection refused" Apr 24 22:19:46.010410 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.010379 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 22:19:46.245511 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.245481 2578 generic.go:358] "Generic (PLEG): container finished" podID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerID="fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c" exitCode=2 Apr 24 22:19:46.245649 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.245523 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerDied","Data":"fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c"} Apr 24 22:19:46.660842 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.660805 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp"] Apr 24 22:19:46.661272 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.661259 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" Apr 24 22:19:46.661397 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.661281 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" Apr 24 22:19:46.661397 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.661310 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" Apr 24 22:19:46.661397 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.661319 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" Apr 24 22:19:46.661553 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.661438 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" Apr 24 22:19:46.661607 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.661598 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b85d6f3-d103-43ba-9ce9-53fe8803ad67" containerName="storage-initializer" Apr 24 22:19:46.664536 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.664515 2578 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.667042 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.667017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 24 22:19:46.667149 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.667050 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 24 22:19:46.675608 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.675589 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp"] Apr 24 22:19:46.810672 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.810651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkj8\" (UniqueName: \"kubernetes.io/projected/b6fd4750-6049-4209-9853-78972fdd23ca-kube-api-access-wbkj8\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.810810 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.810681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b6fd4750-6049-4209-9853-78972fdd23ca-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.810810 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 22:19:46.810704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.810810 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.810735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6fd4750-6049-4209-9853-78972fdd23ca-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.911945 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.911882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6fd4750-6049-4209-9853-78972fdd23ca-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.912040 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.911957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkj8\" (UniqueName: \"kubernetes.io/projected/b6fd4750-6049-4209-9853-78972fdd23ca-kube-api-access-wbkj8\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.912040 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 22:19:46.911976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b6fd4750-6049-4209-9853-78972fdd23ca-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.912040 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.911996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.912198 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:46.912174 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 24 22:19:46.912263 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.912245 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6fd4750-6049-4209-9853-78972fdd23ca-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.912263 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:46.912258 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls podName:b6fd4750-6049-4209-9853-78972fdd23ca 
nodeName:}" failed. No retries permitted until 2026-04-24 22:19:47.412236171 +0000 UTC m=+3811.380038212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" (UID: "b6fd4750-6049-4209-9853-78972fdd23ca") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 24 22:19:46.912560 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.912542 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b6fd4750-6049-4209-9853-78972fdd23ca-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:46.920956 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:46.920936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkj8\" (UniqueName: \"kubernetes.io/projected/b6fd4750-6049-4209-9853-78972fdd23ca-kube-api-access-wbkj8\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:47.415031 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:47.414998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:47.417463 
ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:47.417441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:47.575804 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:47.575774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:47.691011 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:47.690945 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp"] Apr 24 22:19:47.694506 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:19:47.694454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6fd4750_6049_4209_9853_78972fdd23ca.slice/crio-de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae WatchSource:0}: Error finding container de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae: Status 404 returned error can't find the container with id de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae Apr 24 22:19:47.696280 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:47.696258 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:19:48.252030 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:48.251993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" 
event={"ID":"b6fd4750-6049-4209-9853-78972fdd23ca","Type":"ContainerStarted","Data":"8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49"} Apr 24 22:19:48.252030 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:48.252031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" event={"ID":"b6fd4750-6049-4209-9853-78972fdd23ca","Type":"ContainerStarted","Data":"de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae"} Apr 24 22:19:49.396187 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.396165 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:19:49.428527 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.428499 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " Apr 24 22:19:49.428691 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.428542 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksgj\" (UniqueName: \"kubernetes.io/projected/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kube-api-access-2ksgj\") pod \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " Apr 24 22:19:49.428691 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.428625 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-proxy-tls\") pod \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " Apr 24 
22:19:49.428691 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.428663 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kserve-provision-location\") pod \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " Apr 24 22:19:49.428894 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.428709 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-cabundle-cert\") pod \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\" (UID: \"151ccca9-3ef0-4c70-8ed7-32bf851be5ba\") " Apr 24 22:19:49.429011 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.428983 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "151ccca9-3ef0-4c70-8ed7-32bf851be5ba" (UID: "151ccca9-3ef0-4c70-8ed7-32bf851be5ba"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:19:49.429094 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.429034 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "151ccca9-3ef0-4c70-8ed7-32bf851be5ba" (UID: "151ccca9-3ef0-4c70-8ed7-32bf851be5ba"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:19:49.429317 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.429294 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "151ccca9-3ef0-4c70-8ed7-32bf851be5ba" (UID: "151ccca9-3ef0-4c70-8ed7-32bf851be5ba"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:19:49.431186 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.431156 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "151ccca9-3ef0-4c70-8ed7-32bf851be5ba" (UID: "151ccca9-3ef0-4c70-8ed7-32bf851be5ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:19:49.431186 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.431169 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kube-api-access-2ksgj" (OuterVolumeSpecName: "kube-api-access-2ksgj") pod "151ccca9-3ef0-4c70-8ed7-32bf851be5ba" (UID: "151ccca9-3ef0-4c70-8ed7-32bf851be5ba"). InnerVolumeSpecName "kube-api-access-2ksgj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:19:49.529860 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.529802 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:49.529860 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.529823 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ksgj\" (UniqueName: \"kubernetes.io/projected/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kube-api-access-2ksgj\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:49.529860 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.529834 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:49.529860 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.529843 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:49.529860 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:49.529851 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/151ccca9-3ef0-4c70-8ed7-32bf851be5ba-cabundle-cert\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:50.261115 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.261088 2578 generic.go:358] "Generic (PLEG): container finished" podID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerID="a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a" 
exitCode=0 Apr 24 22:19:50.261255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.261121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerDied","Data":"a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a"} Apr 24 22:19:50.261255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.261143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" event={"ID":"151ccca9-3ef0-4c70-8ed7-32bf851be5ba","Type":"ContainerDied","Data":"9c6b23afd43702e98e8540be4b571114cf407cd86ea026cb9b1d5f576161f2d4"} Apr 24 22:19:50.261255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.261157 2578 scope.go:117] "RemoveContainer" containerID="fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c" Apr 24 22:19:50.261255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.261161 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2" Apr 24 22:19:50.273135 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.272970 2578 scope.go:117] "RemoveContainer" containerID="a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a" Apr 24 22:19:50.280139 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.280121 2578 scope.go:117] "RemoveContainer" containerID="27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235" Apr 24 22:19:50.283332 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.283313 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2"] Apr 24 22:19:50.286986 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.286965 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-55fcc5fc5d-b7hz2"] Apr 24 22:19:50.288436 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.288423 2578 scope.go:117] "RemoveContainer" containerID="fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c" Apr 24 22:19:50.288664 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:50.288645 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c\": container with ID starting with fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c not found: ID does not exist" containerID="fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c" Apr 24 22:19:50.288737 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.288679 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c"} err="failed to get container status \"fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c\": rpc 
error: code = NotFound desc = could not find container \"fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c\": container with ID starting with fe0605b27bf7bf2df3bacb4d47c65b4151b7de72277b82127afc4b10ec45504c not found: ID does not exist" Apr 24 22:19:50.288737 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.288704 2578 scope.go:117] "RemoveContainer" containerID="a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a" Apr 24 22:19:50.289051 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:50.289034 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a\": container with ID starting with a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a not found: ID does not exist" containerID="a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a" Apr 24 22:19:50.289100 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.289057 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a"} err="failed to get container status \"a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a\": rpc error: code = NotFound desc = could not find container \"a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a\": container with ID starting with a8ed170e1a1b337da86d0c083dbea1813da2ece4be20486f7ad1a45574b19c2a not found: ID does not exist" Apr 24 22:19:50.289100 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.289073 2578 scope.go:117] "RemoveContainer" containerID="27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235" Apr 24 22:19:50.289274 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:50.289258 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235\": container with ID starting with 27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235 not found: ID does not exist" containerID="27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235" Apr 24 22:19:50.289324 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.289282 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235"} err="failed to get container status \"27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235\": rpc error: code = NotFound desc = could not find container \"27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235\": container with ID starting with 27688fb3873ee8cc2b3b8deb5492ab8bfd56305ce7e97325d0e69ea9da8d1235 not found: ID does not exist" Apr 24 22:19:50.659969 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:50.659941 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" path="/var/lib/kubelet/pods/151ccca9-3ef0-4c70-8ed7-32bf851be5ba/volumes" Apr 24 22:19:52.269936 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:52.269909 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/0.log" Apr 24 22:19:52.270300 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:52.269947 2578 generic.go:358] "Generic (PLEG): container finished" podID="b6fd4750-6049-4209-9853-78972fdd23ca" containerID="8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49" exitCode=1 Apr 24 22:19:52.270300 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:52.270027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" 
event={"ID":"b6fd4750-6049-4209-9853-78972fdd23ca","Type":"ContainerDied","Data":"8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49"} Apr 24 22:19:53.274530 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:53.274500 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/0.log" Apr 24 22:19:53.274894 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:53.274557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" event={"ID":"b6fd4750-6049-4209-9853-78972fdd23ca","Type":"ContainerStarted","Data":"dce5ae642e636ab654b5e2751e662992e448e228b9fff076315e68ed376100b3"} Apr 24 22:19:55.281037 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:55.281014 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/1.log" Apr 24 22:19:55.281343 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:55.281290 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/0.log" Apr 24 22:19:55.281343 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:55.281317 2578 generic.go:358] "Generic (PLEG): container finished" podID="b6fd4750-6049-4209-9853-78972fdd23ca" containerID="dce5ae642e636ab654b5e2751e662992e448e228b9fff076315e68ed376100b3" exitCode=1 Apr 24 22:19:55.281422 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:55.281366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" 
event={"ID":"b6fd4750-6049-4209-9853-78972fdd23ca","Type":"ContainerDied","Data":"dce5ae642e636ab654b5e2751e662992e448e228b9fff076315e68ed376100b3"} Apr 24 22:19:55.281422 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:55.281393 2578 scope.go:117] "RemoveContainer" containerID="8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49" Apr 24 22:19:55.281792 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:55.281773 2578 scope.go:117] "RemoveContainer" containerID="8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49" Apr 24 22:19:55.291258 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:55.291230 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_kserve-ci-e2e-test_b6fd4750-6049-4209-9853-78972fdd23ca_0 in pod sandbox de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae from index: no such id: '8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49'" containerID="8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49" Apr 24 22:19:55.291328 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:55.291272 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_kserve-ci-e2e-test_b6fd4750-6049-4209-9853-78972fdd23ca_0 in pod sandbox de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae from index: no such id: '8912b79489c6dcac8543403e7223b918e4be3262269f94e571f6dc9fad842a49'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_kserve-ci-e2e-test(b6fd4750-6049-4209-9853-78972fdd23ca)\"" logger="UnhandledError" Apr 24 22:19:55.292578 ip-10-0-134-248 kubenswrapper[2578]: E0424 22:19:55.292557 2578 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_kserve-ci-e2e-test(b6fd4750-6049-4209-9853-78972fdd23ca)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" Apr 24 22:19:56.285801 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.285775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/1.log" Apr 24 22:19:56.646394 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.646322 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp"] Apr 24 22:19:56.765768 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.765732 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/1.log" Apr 24 22:19:56.765866 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.765823 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:56.781695 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.781671 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b6fd4750-6049-4209-9853-78972fdd23ca-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"b6fd4750-6049-4209-9853-78972fdd23ca\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " Apr 24 22:19:56.781783 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.781715 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls\") pod \"b6fd4750-6049-4209-9853-78972fdd23ca\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " Apr 24 22:19:56.781783 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.781743 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6fd4750-6049-4209-9853-78972fdd23ca-kserve-provision-location\") pod \"b6fd4750-6049-4209-9853-78972fdd23ca\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " Apr 24 22:19:56.781900 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.781810 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbkj8\" (UniqueName: \"kubernetes.io/projected/b6fd4750-6049-4209-9853-78972fdd23ca-kube-api-access-wbkj8\") pod \"b6fd4750-6049-4209-9853-78972fdd23ca\" (UID: \"b6fd4750-6049-4209-9853-78972fdd23ca\") " Apr 24 22:19:56.782052 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.782023 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6fd4750-6049-4209-9853-78972fdd23ca-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "b6fd4750-6049-4209-9853-78972fdd23ca" (UID: "b6fd4750-6049-4209-9853-78972fdd23ca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:19:56.782114 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.782068 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6fd4750-6049-4209-9853-78972fdd23ca-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "b6fd4750-6049-4209-9853-78972fdd23ca" (UID: "b6fd4750-6049-4209-9853-78972fdd23ca"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:19:56.783761 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.783730 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b6fd4750-6049-4209-9853-78972fdd23ca" (UID: "b6fd4750-6049-4209-9853-78972fdd23ca"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:19:56.783890 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.783869 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6fd4750-6049-4209-9853-78972fdd23ca-kube-api-access-wbkj8" (OuterVolumeSpecName: "kube-api-access-wbkj8") pod "b6fd4750-6049-4209-9853-78972fdd23ca" (UID: "b6fd4750-6049-4209-9853-78972fdd23ca"). InnerVolumeSpecName "kube-api-access-wbkj8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:19:56.882428 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.882408 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbkj8\" (UniqueName: \"kubernetes.io/projected/b6fd4750-6049-4209-9853-78972fdd23ca-kube-api-access-wbkj8\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:56.882428 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.882428 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b6fd4750-6049-4209-9853-78972fdd23ca-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:56.882559 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.882439 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6fd4750-6049-4209-9853-78972fdd23ca-proxy-tls\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:56.882559 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:56.882448 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6fd4750-6049-4209-9853-78972fdd23ca-kserve-provision-location\") on node \"ip-10-0-134-248.ec2.internal\" DevicePath \"\"" Apr 24 22:19:57.288927 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:57.288896 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp_b6fd4750-6049-4209-9853-78972fdd23ca/storage-initializer/1.log" Apr 24 22:19:57.289359 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:57.288990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" 
event={"ID":"b6fd4750-6049-4209-9853-78972fdd23ca","Type":"ContainerDied","Data":"de7a0d3522e53c83b46a0912f23e289fabc4ea5c0a63ea76acb6b978ac1001ae"} Apr 24 22:19:57.289359 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:57.289022 2578 scope.go:117] "RemoveContainer" containerID="dce5ae642e636ab654b5e2751e662992e448e228b9fff076315e68ed376100b3" Apr 24 22:19:57.289359 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:57.289040 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp" Apr 24 22:19:57.325645 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:57.325599 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp"] Apr 24 22:19:57.328121 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:57.328092 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-79bd57f944-7rmnp"] Apr 24 22:19:58.660039 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:19:58.660010 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" path="/var/lib/kubelet/pods/b6fd4750-6049-4209-9853-78972fdd23ca/volumes" Apr 24 22:20:26.282947 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.282873 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zhfh/must-gather-j8v9z"] Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283205 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283221 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kserve-container" Apr 24 22:20:26.283406 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:20:26.283238 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283244 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283253 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kube-rbac-proxy" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283259 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kube-rbac-proxy" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283272 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283277 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283288 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283293 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283342 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" 
containerName="kserve-container" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283355 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="151ccca9-3ef0-4c70-8ed7-32bf851be5ba" containerName="kube-rbac-proxy" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283362 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" containerName="storage-initializer" Apr 24 22:20:26.283406 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.283370 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6fd4750-6049-4209-9853-78972fdd23ca" containerName="storage-initializer" Apr 24 22:20:26.286458 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.286440 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.289018 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.288989 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2zhfh\"/\"default-dockercfg-fxrng\"" Apr 24 22:20:26.289164 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.289147 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2zhfh\"/\"openshift-service-ca.crt\"" Apr 24 22:20:26.289849 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.289834 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2zhfh\"/\"kube-root-ca.crt\"" Apr 24 22:20:26.296033 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.296009 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/must-gather-j8v9z"] Apr 24 22:20:26.387627 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.387602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grsg8\" (UniqueName: 
\"kubernetes.io/projected/0a884cb5-20d0-403c-a771-aa6c31681684-kube-api-access-grsg8\") pod \"must-gather-j8v9z\" (UID: \"0a884cb5-20d0-403c-a771-aa6c31681684\") " pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.387771 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.387648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a884cb5-20d0-403c-a771-aa6c31681684-must-gather-output\") pod \"must-gather-j8v9z\" (UID: \"0a884cb5-20d0-403c-a771-aa6c31681684\") " pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.488202 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.488168 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a884cb5-20d0-403c-a771-aa6c31681684-must-gather-output\") pod \"must-gather-j8v9z\" (UID: \"0a884cb5-20d0-403c-a771-aa6c31681684\") " pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.488365 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.488235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grsg8\" (UniqueName: \"kubernetes.io/projected/0a884cb5-20d0-403c-a771-aa6c31681684-kube-api-access-grsg8\") pod \"must-gather-j8v9z\" (UID: \"0a884cb5-20d0-403c-a771-aa6c31681684\") " pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.488569 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.488546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a884cb5-20d0-403c-a771-aa6c31681684-must-gather-output\") pod \"must-gather-j8v9z\" (UID: \"0a884cb5-20d0-403c-a771-aa6c31681684\") " pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.497196 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.497173 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-grsg8\" (UniqueName: \"kubernetes.io/projected/0a884cb5-20d0-403c-a771-aa6c31681684-kube-api-access-grsg8\") pod \"must-gather-j8v9z\" (UID: \"0a884cb5-20d0-403c-a771-aa6c31681684\") " pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.601042 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.600975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zhfh/must-gather-j8v9z" Apr 24 22:20:26.726273 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:26.726195 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/must-gather-j8v9z"] Apr 24 22:20:26.728464 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:20:26.728440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a884cb5_20d0_403c_a771_aa6c31681684.slice/crio-c075abddb28a832e74e5d421a6530b8441797232ffa00b31c34587d63db64bec WatchSource:0}: Error finding container c075abddb28a832e74e5d421a6530b8441797232ffa00b31c34587d63db64bec: Status 404 returned error can't find the container with id c075abddb28a832e74e5d421a6530b8441797232ffa00b31c34587d63db64bec Apr 24 22:20:27.377128 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:27.377077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/must-gather-j8v9z" event={"ID":"0a884cb5-20d0-403c-a771-aa6c31681684","Type":"ContainerStarted","Data":"c075abddb28a832e74e5d421a6530b8441797232ffa00b31c34587d63db64bec"} Apr 24 22:20:28.383120 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:28.383083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/must-gather-j8v9z" event={"ID":"0a884cb5-20d0-403c-a771-aa6c31681684","Type":"ContainerStarted","Data":"a4211cebff8ceacd2821e085af649ba33f98cf50f2315bf2026dc39cec3a7dc6"} Apr 24 22:20:28.383631 ip-10-0-134-248 kubenswrapper[2578]: 
I0424 22:20:28.383600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/must-gather-j8v9z" event={"ID":"0a884cb5-20d0-403c-a771-aa6c31681684","Type":"ContainerStarted","Data":"01beff0ee7504380cb2e2c22aae501e4819a8ca47b46d61a9708967a92b997d6"} Apr 24 22:20:28.401004 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:28.400942 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2zhfh/must-gather-j8v9z" podStartSLOduration=1.5325869650000001 podStartE2EDuration="2.400920945s" podCreationTimestamp="2026-04-24 22:20:26 +0000 UTC" firstStartedPulling="2026-04-24 22:20:26.730264687 +0000 UTC m=+3850.698066728" lastFinishedPulling="2026-04-24 22:20:27.598598668 +0000 UTC m=+3851.566400708" observedRunningTime="2026-04-24 22:20:28.398856884 +0000 UTC m=+3852.366658946" watchObservedRunningTime="2026-04-24 22:20:28.400920945 +0000 UTC m=+3852.368723028" Apr 24 22:20:29.073507 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:29.073460 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-44jqp_e2296a6c-7782-4207-85a9-4737aabd81e8/global-pull-secret-syncer/0.log" Apr 24 22:20:29.280311 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:29.280284 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kt72p_77098f84-e6ad-4ad0-b456-f4f82edf95bf/konnectivity-agent/0.log" Apr 24 22:20:29.482677 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:29.482638 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-248.ec2.internal_6c544ccdf1879496756152eb8f8b28eb/haproxy/0.log" Apr 24 22:20:31.907050 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:31.907018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/alertmanager/0.log" Apr 24 22:20:31.937241 ip-10-0-134-248 
kubenswrapper[2578]: I0424 22:20:31.937217 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/config-reloader/0.log" Apr 24 22:20:31.969058 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:31.969020 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/kube-rbac-proxy-web/0.log" Apr 24 22:20:31.999873 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:31.999829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/kube-rbac-proxy/0.log" Apr 24 22:20:32.033399 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.033373 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/kube-rbac-proxy-metric/0.log" Apr 24 22:20:32.059045 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.058986 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/prom-label-proxy/0.log" Apr 24 22:20:32.092998 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.092972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_86042033-8eca-4738-a4bf-f31cc898ce69/init-config-reloader/0.log" Apr 24 22:20:32.182838 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.182715 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqzx4_0a173019-029b-4950-854e-8165aa0b2dd9/kube-state-metrics/0.log" Apr 24 22:20:32.213369 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.213336 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqzx4_0a173019-029b-4950-854e-8165aa0b2dd9/kube-rbac-proxy-main/0.log" Apr 
24 22:20:32.239742 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.239699 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqzx4_0a173019-029b-4950-854e-8165aa0b2dd9/kube-rbac-proxy-self/0.log" Apr 24 22:20:32.322237 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.322193 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6bf8t_7f6ca405-4c66-464f-ab4b-de5695efac52/monitoring-plugin/0.log" Apr 24 22:20:32.465947 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.465359 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntgvc_e4a2d6c6-7f94-4021-965e-83df5466f932/node-exporter/0.log" Apr 24 22:20:32.492771 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.492234 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntgvc_e4a2d6c6-7f94-4021-965e-83df5466f932/kube-rbac-proxy/0.log" Apr 24 22:20:32.520302 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.520265 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntgvc_e4a2d6c6-7f94-4021-965e-83df5466f932/init-textfile/0.log" Apr 24 22:20:32.637618 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.637584 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2qk8_e6a7cb9f-8906-4db9-a2f9-ae926946111a/kube-rbac-proxy-main/0.log" Apr 24 22:20:32.659934 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.659905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2qk8_e6a7cb9f-8906-4db9-a2f9-ae926946111a/kube-rbac-proxy-self/0.log" Apr 24 22:20:32.684397 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:32.684365 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2qk8_e6a7cb9f-8906-4db9-a2f9-ae926946111a/openshift-state-metrics/0.log" Apr 24 22:20:33.053863 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:33.053819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8667f847-f6cb2_bb8263f1-b10d-461d-9d2e-d38fc7ff82f3/telemeter-client/0.log" Apr 24 22:20:33.079335 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:33.079308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8667f847-f6cb2_bb8263f1-b10d-461d-9d2e-d38fc7ff82f3/reload/0.log" Apr 24 22:20:33.106238 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:33.106212 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8667f847-f6cb2_bb8263f1-b10d-461d-9d2e-d38fc7ff82f3/kube-rbac-proxy/0.log" Apr 24 22:20:34.580236 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:34.580201 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-5t8j7_22fc467d-f2cb-40ba-8129-3562ce16391d/networking-console-plugin/0.log" Apr 24 22:20:35.447106 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:35.447082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cd8c69c5-42vtl_c3a2bd38-885a-4011-aafa-732484642c2d/console/0.log" Apr 24 22:20:35.491114 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:35.491086 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-bh9lr_bda5e38a-75b7-4355-b330-717228aa7a75/download-server/0.log" Apr 24 22:20:35.876032 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:35.876001 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pmdfz_549de488-07e9-4da0-aa3b-352b0762cb06/volume-data-source-validator/0.log" Apr 24 
22:20:36.376906 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.376867 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj"] Apr 24 22:20:36.381504 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.381471 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.390160 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.390136 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj"] Apr 24 22:20:36.473728 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.473693 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-proc\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.473728 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.473736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lm66\" (UniqueName: \"kubernetes.io/projected/a10c02b6-79ae-4401-ac0e-c95f267aaee2-kube-api-access-6lm66\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.473959 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.473826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-podres\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.473959 
ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.473855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-sys\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.473959 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.473906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-lib-modules\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574340 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lm66\" (UniqueName: \"kubernetes.io/projected/a10c02b6-79ae-4401-ac0e-c95f267aaee2-kube-api-access-6lm66\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574526 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-podres\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574526 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-sys\") pod 
\"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574526 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-lib-modules\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574526 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-proc\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574776 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-sys\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574776 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-proc\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574776 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-podres\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.574776 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.574622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a10c02b6-79ae-4401-ac0e-c95f267aaee2-lib-modules\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.582000 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.581976 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6bsmb_33cc3935-f9cc-4484-ba91-4c3e16828c08/dns/0.log" Apr 24 22:20:36.582138 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.582123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lm66\" (UniqueName: \"kubernetes.io/projected/a10c02b6-79ae-4401-ac0e-c95f267aaee2-kube-api-access-6lm66\") pod \"perf-node-gather-daemonset-n6whj\" (UID: \"a10c02b6-79ae-4401-ac0e-c95f267aaee2\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" Apr 24 22:20:36.601380 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.601359 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6bsmb_33cc3935-f9cc-4484-ba91-4c3e16828c08/kube-rbac-proxy/0.log" Apr 24 22:20:36.662849 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.662821 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pvc6x_2a3c74ea-5d5a-4252-973d-273be9ad3ca5/dns-node-resolver/0.log" Apr 24 22:20:36.692255 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.692229 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj"
Apr 24 22:20:36.830741 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:36.830680 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj"]
Apr 24 22:20:36.833534 ip-10-0-134-248 kubenswrapper[2578]: W0424 22:20:36.833505 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda10c02b6_79ae_4401_ac0e_c95f267aaee2.slice/crio-c5b0cd25590c0ee4e6122f35313a49dba42c4a47ebdce5173c9ed786f244e681 WatchSource:0}: Error finding container c5b0cd25590c0ee4e6122f35313a49dba42c4a47ebdce5173c9ed786f244e681: Status 404 returned error can't find the container with id c5b0cd25590c0ee4e6122f35313a49dba42c4a47ebdce5173c9ed786f244e681
Apr 24 22:20:37.206996 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:37.206967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xtf2m_799c7f5a-9111-4e65-8973-f1d3fd28c13e/node-ca/0.log"
Apr 24 22:20:37.420475 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:37.420440 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" event={"ID":"a10c02b6-79ae-4401-ac0e-c95f267aaee2","Type":"ContainerStarted","Data":"ae02e868eb20c952c14a35c692c7ea8885f84f77d101a426550edf40224bf070"}
Apr 24 22:20:37.420475 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:37.420482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" event={"ID":"a10c02b6-79ae-4401-ac0e-c95f267aaee2","Type":"ContainerStarted","Data":"c5b0cd25590c0ee4e6122f35313a49dba42c4a47ebdce5173c9ed786f244e681"}
Apr 24 22:20:37.420682 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:37.420513 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj"
Apr 24 22:20:37.435999 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:37.435946 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj" podStartSLOduration=1.435931043 podStartE2EDuration="1.435931043s" podCreationTimestamp="2026-04-24 22:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:20:37.43432967 +0000 UTC m=+3861.402131732" watchObservedRunningTime="2026-04-24 22:20:37.435931043 +0000 UTC m=+3861.403733105"
Apr 24 22:20:37.885398 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:37.885366 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d76d9f5d-zjqpm_98762743-4e7e-44a9-878c-f6893dcf9c44/router/0.log"
Apr 24 22:20:38.188604 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:38.188577 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-867cc_5a97d756-3ff0-4986-bf6f-582a917fdc0a/serve-healthcheck-canary/0.log"
Apr 24 22:20:38.658487 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:38.658447 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ccjvm_a3a9aad5-ac45-4187-9590-f87a0a2157cd/kube-rbac-proxy/0.log"
Apr 24 22:20:38.680265 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:38.680236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ccjvm_a3a9aad5-ac45-4187-9590-f87a0a2157cd/exporter/0.log"
Apr 24 22:20:38.703051 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:38.703028 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ccjvm_a3a9aad5-ac45-4187-9590-f87a0a2157cd/extractor/0.log"
Apr 24 22:20:40.696083 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:40.696044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-74fc8f6f96-wh42d_26ad0017-0c4c-4b7f-b789-c3f268e9a34b/manager/0.log"
Apr 24 22:20:40.715690 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:40.715660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-rq55d_06fbf7df-838a-4579-9dbf-44959637e896/manager/0.log"
Apr 24 22:20:43.434976 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:43.434940 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-n6whj"
Apr 24 22:20:45.874798 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:45.874770 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:20:45.902201 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:45.902172 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/egress-router-binary-copy/0.log"
Apr 24 22:20:45.922922 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:45.922883 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/cni-plugins/0.log"
Apr 24 22:20:45.948018 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:45.947881 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/bond-cni-plugin/0.log"
Apr 24 22:20:45.970857 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:45.970833 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/routeoverride-cni/0.log"
Apr 24 22:20:45.991018 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:45.990996 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/whereabouts-cni-bincopy/0.log"
Apr 24 22:20:46.013628 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:46.013609 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-84bkj_9ec1111b-6a43-49dd-978e-ad82b438f091/whereabouts-cni/0.log"
Apr 24 22:20:46.388864 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:46.388839 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vvqbk_c6a36d68-e5f6-4ff5-8bbd-95e656f22006/kube-multus/0.log"
Apr 24 22:20:46.411618 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:46.411597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcqjb_371c1fec-a68a-4ff5-b5fc-29a34feb3ffe/network-metrics-daemon/0.log"
Apr 24 22:20:46.431521 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:46.431487 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcqjb_371c1fec-a68a-4ff5-b5fc-29a34feb3ffe/kube-rbac-proxy/0.log"
Apr 24 22:20:47.263918 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.263887 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-controller/0.log"
Apr 24 22:20:47.345226 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.345196 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/0.log"
Apr 24 22:20:47.363451 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.363430 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovn-acl-logging/1.log"
Apr 24 22:20:47.391504 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.391485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/kube-rbac-proxy-node/0.log"
Apr 24 22:20:47.417169 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.417143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:20:47.435693 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.435672 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/northd/0.log"
Apr 24 22:20:47.455662 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.455641 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/nbdb/0.log"
Apr 24 22:20:47.475953 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.475921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/sbdb/0.log"
Apr 24 22:20:47.596655 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:47.596624 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-49kt7_e70e5f9c-8c1a-4ad0-b8e0-9f7176780519/ovnkube-controller/0.log"
Apr 24 22:20:49.124491 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:49.124453 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vrf9q_6fda3b6d-a4e2-4aa3-b140-9768563e5f02/network-check-target-container/0.log"
Apr 24 22:20:50.049026 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:50.048992 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8qkh9_034136db-21a3-428a-894b-90395491da10/iptables-alerter/0.log"
Apr 24 22:20:50.730091 ip-10-0-134-248 kubenswrapper[2578]: I0424 22:20:50.730061 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5dm7j_44e23ee3-f057-4ec2-bc73-ccfb6c251e9c/tuned/0.log"