Apr 23 13:29:09.723122 ip-10-0-139-40 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 23 13:29:09.723137 ip-10-0-139-40 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 23 13:29:09.723147 ip-10-0-139-40 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 23 13:29:09.723478 ip-10-0-139-40 systemd[1]: Failed to start Kubernetes Kubelet. Apr 23 13:29:21.063110 ip-10-0-139-40 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 23 13:29:21.063125 ip-10-0-139-40 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot 521e965e5a0b49d98e909f4e2dbd23f9 -- Apr 23 13:31:36.866286 ip-10-0-139-40 systemd[1]: Starting Kubernetes Kubelet... Apr 23 13:31:37.395997 ip-10-0-139-40 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 13:31:37.395997 ip-10-0-139-40 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 23 13:31:37.395997 ip-10-0-139-40 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 13:31:37.395997 ip-10-0-139-40 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 23 13:31:37.395997 ip-10-0-139-40 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 23 13:31:37.397886 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.397801 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 23 13:31:37.403420 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403381 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:37.403420 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403413 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:37.403420 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403422 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:37.403420 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403427 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403435 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403440 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403445 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403450 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403455 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403459 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403463 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403468 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403479 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403484 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403490 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403496 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403500 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403504 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403508 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403512 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403516 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403520 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:37.403663 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403524 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403529 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403538 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403542 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403546 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403550 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403554 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403559 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403563 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403567 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403571 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403590 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403595 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403599 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403650 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403677 
2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403800 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403810 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403815 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403820 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:37.404138 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403826 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403830 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403835 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403842 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403848 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403852 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403857 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403861 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403865 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403869 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403873 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403878 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403882 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403886 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403890 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403893 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403897 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403901 2582 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403905 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403909 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:37.404654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403930 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403935 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403940 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403944 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403948 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403952 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403956 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403962 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403966 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403970 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403975 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403979 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403987 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403992 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403995 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.403998 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404001 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404004 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404007 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404009 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:37.405166 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404012 2582 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404014 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404017 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404019 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404436 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404443 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404446 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404449 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404452 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404454 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404458 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404460 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404463 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404465 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404467 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404470 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404473 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404476 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404478 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404482 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:37.405644 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404484 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404488 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404490 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 
13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404493 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404496 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404498 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404500 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404503 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404506 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404508 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404511 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404513 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404516 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404518 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404521 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404523 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404526 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404529 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404531 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404533 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:37.406150 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404536 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404538 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404542 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404545 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404548 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404551 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404553 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404556 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404558 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404561 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404563 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404566 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404569 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404571 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404574 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404576 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404579 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404581 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404584 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:37.406647 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404586 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404589 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404592 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404594 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404597 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404599 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404601 
2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404604 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404606 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404609 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404611 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404614 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404616 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404618 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404621 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404623 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404626 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404628 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404630 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404633 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:37.407181 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404635 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404640 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404643 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404645 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404648 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404651 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404653 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404656 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404659 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404662 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.404664 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404743 2582 flags.go:64] FLAG: --address="0.0.0.0" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404753 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404762 2582 flags.go:64] FLAG: --anonymous-auth="true" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404768 2582 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404775 2582 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404781 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404788 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404793 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404796 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404799 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 23 13:31:37.407654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404803 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404806 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404809 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404812 2582 flags.go:64] FLAG: --cgroup-root="" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404814 2582 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 23 
13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404818 2582 flags.go:64] FLAG: --client-ca-file="" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404820 2582 flags.go:64] FLAG: --cloud-config="" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404823 2582 flags.go:64] FLAG: --cloud-provider="external" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404826 2582 flags.go:64] FLAG: --cluster-dns="[]" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404831 2582 flags.go:64] FLAG: --cluster-domain="" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404834 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404837 2582 flags.go:64] FLAG: --config-dir="" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404840 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404843 2582 flags.go:64] FLAG: --container-log-max-files="5" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404847 2582 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404851 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404855 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404858 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404861 2582 flags.go:64] FLAG: --contention-profiling="false" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404864 2582 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404867 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404870 2582 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404872 2582 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404877 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404880 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 23 13:31:37.408180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404883 2582 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404885 2582 flags.go:64] FLAG: --enable-load-reader="false" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404889 2582 flags.go:64] FLAG: --enable-server="true" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404892 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404896 2582 flags.go:64] FLAG: --event-burst="100" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404900 2582 flags.go:64] FLAG: --event-qps="50" Apr 23 13:31:37.408784 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404903 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404906 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404909 2582 flags.go:64] FLAG: --eviction-hard="" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404937 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404941 2582 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404944 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404947 2582 flags.go:64] FLAG: --eviction-soft="" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404950 2582 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404953 2582 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404955 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404958 2582 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404961 2582 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404964 2582 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404967 2582 flags.go:64] FLAG: --feature-gates="" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404971 2582 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404973 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404977 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404980 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404983 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:31:37.408784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404986 2582 flags.go:64] FLAG: --help="false" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404988 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404991 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404994 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.404997 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405001 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:31:37.409402 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405004 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405007 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405009 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405012 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405015 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405019 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405022 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405025 2582 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405027 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405030 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405033 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405035 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405038 2582 flags.go:64] FLAG: --lock-file="" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405041 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405044 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405047 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405052 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:31:37.409402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405055 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405057 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405060 2582 flags.go:64] FLAG: --logging-format="text" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405063 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405066 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405069 2582 flags.go:64] FLAG: --manifest-url="" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405072 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405077 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405079 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:31:37.409968 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:31:37.405083 2582 flags.go:64] FLAG: --max-pods="110" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405086 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405089 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405092 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405095 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405098 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405101 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405103 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405111 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405114 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405117 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405120 2582 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405123 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405129 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405131 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:31:37.409968 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405134 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405137 2582 flags.go:64] FLAG: --port="10250" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405140 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405143 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0112a689c7556e062" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405146 2582 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405149 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405152 2582 flags.go:64] FLAG: --register-node="true" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405155 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405157 2582 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405161 2582 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405163 2582 flags.go:64] FLAG: 
--registry-qps="5" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405166 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405169 2582 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405173 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405177 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405180 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405183 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405186 2582 flags.go:64] FLAG: --runonce="false" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405189 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405192 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405195 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405198 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405201 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405204 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405207 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405210 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:31:37.410545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405213 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405215 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405218 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405225 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405229 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405232 2582 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405235 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405240 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405243 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405245 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405250 2582 flags.go:64] FLAG: --tls-min-version="" 
Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405253 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405255 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405258 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405261 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405264 2582 flags.go:64] FLAG: --v="2" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405269 2582 flags.go:64] FLAG: --version="false" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405273 2582 flags.go:64] FLAG: --vmodule="" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405277 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405280 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405377 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405381 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405384 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405387 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:37.411179 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405389 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405392 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405394 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405397 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405399 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405402 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405405 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405407 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405410 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405412 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405415 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 
13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405418 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405424 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405426 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405429 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405431 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405434 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405438 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405441 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:37.411781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405444 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405447 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405450 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405452 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405455 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405458 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405461 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405463 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405467 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405473 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405475 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405478 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405480 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405483 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405486 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405488 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405491 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405493 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405496 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405498 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:37.412318 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405501 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405503 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405506 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405508 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405511 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405513 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405516 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405518 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405521 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405523 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405526 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405528 2582 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405530 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405533 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405535 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405538 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405540 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405543 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405545 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:37.412790 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405548 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405555 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405558 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405561 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405563 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405566 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405568 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405571 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405573 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405576 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405578 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405580 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405583 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405585 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405588 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:37.413276 
ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405590 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405593 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405596 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405598 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405600 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:37.413276 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405603 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405606 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405608 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.405610 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.405616 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.413150 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.413165 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413210 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413215 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413219 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413222 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413225 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413228 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413231 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413234 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks 
Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413237 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:37.413765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413240 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413243 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413245 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413248 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413251 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413253 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413255 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413258 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413261 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413263 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413266 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413268 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413270 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413273 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413275 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413278 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413280 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413283 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413286 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:37.414190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413288 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413291 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:37.414646 
ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413293 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413297 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413300 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413303 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413305 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413308 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413310 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413312 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413315 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413318 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413320 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413322 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413325 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413327 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413329 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413332 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413334 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413338 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:37.414646 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413343 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413346 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413348 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413351 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413353 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413356 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413359 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413361 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413364 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413366 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413369 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413371 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413374 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413377 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413380 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413383 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413387 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413389 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413392 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413394 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:37.415155 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413397 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413399 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413402 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413404 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413407 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413409 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413411 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413414 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413416 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413419 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413421 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413424 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413427 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413429 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413431 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413434 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413436 2582 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Apr 23 13:31:37.415623 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413439 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.413444 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413572 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413577 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413581 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413583 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413586 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413589 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413591 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413594 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413597 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413600 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413603 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413606 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413608 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:37.416088 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413611 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413613 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413616 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413618 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:37.416458 ip-10-0-139-40 
kubenswrapper[2582]: W0423 13:31:37.413621 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413625 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413628 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413631 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413634 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413636 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413638 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413641 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413643 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413646 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413648 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413651 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413653 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413655 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413658 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413661 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:37.416458 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413663 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413666 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413668 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413671 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413674 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413677 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413680 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413684 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413687 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413689 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413693 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413695 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413698 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413700 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413703 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413705 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413708 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413710 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413712 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:37.416954 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413715 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413717 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413719 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413722 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413724 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413727 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413729 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413732 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413734 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:37.417451 ip-10-0-139-40 
kubenswrapper[2582]: W0423 13:31:37.413737 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413739 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413741 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413744 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413746 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413749 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413751 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413753 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413756 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413758 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413761 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:37.417451 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413764 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413766 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413769 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413772 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413775 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413778 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413780 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413783 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413785 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413788 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413790 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413792 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 
13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413795 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:37.413797 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.413801 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:37.418000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.414683 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:31:37.418363 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.418008 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:31:37.419084 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.419073 2582 server.go:1019] "Starting client certificate rotation"
Apr 23 13:31:37.419183 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.419167 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:37.419221 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.419202 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:37.448798 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.448782 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:37.451518 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.451502 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:37.466665 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.466649 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:31:37.473530 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.473516 2582 log.go:25] "Validated CRI v1 image API"
Apr 23 13:31:37.474851 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.474836 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:31:37.477814 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.477784 2582 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 926819b2-8ee1-4c09-a8d5-a53d1d02a53e:/dev/nvme0n1p4 f4cbeb4e-5321-47c9-9d7b-37d290c9712a:/dev/nvme0n1p3]
Apr 23 13:31:37.477897 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.477812 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:31:37.477972 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.477936 2582 reflector.go:430] "Caches populated"
logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:31:37.483085 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.482977 2582 manager.go:217] Machine: {Timestamp:2026-04-23 13:31:37.481654 +0000 UTC m=+0.468894870 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098484 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec229255b6abf53010bc9d0290d58a37 SystemUUID:ec229255-b6ab-f530-10bc-9d0290d58a37 BootID:521e965e-5a0b-49d9-8e90-9f4e2dbd23f9 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d9:0f:39:f7:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d9:0f:39:f7:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:a2:26:bd:13:a5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 13:31:37.483085 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.483074 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 13:31:37.483258 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.483186 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 13:31:37.484536 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.484506 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 13:31:37.484712 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.484539 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-40.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 13:31:37.484795 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.484725 2582 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 13:31:37.484795 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.484738 2582 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 13:31:37.484795 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.484756 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:31:37.486041 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.486028 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:31:37.487958 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.487946 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:31:37.488086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.488075 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 13:31:37.490911 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.490900 2582 kubelet.go:491] "Attempting to sync node with API server" Apr 23 13:31:37.490996 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.490933 2582 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 23 13:31:37.490996 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.490950 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 13:31:37.490996 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.490967 2582 kubelet.go:397] "Adding apiserver pod source" Apr 23 13:31:37.490996 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.490979 2582 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 13:31:37.492160 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.492147 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:31:37.492221 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.492171 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:31:37.495433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.495413 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 13:31:37.495510 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.495440 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l27kh" Apr 23 13:31:37.497232 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.497219 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:31:37.499409 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499398 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:31:37.499451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499415 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:31:37.499451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499424 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:31:37.499451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499433 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:31:37.499451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499441 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:31:37.499451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499449 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:31:37.499620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499455 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 13:31:37.499620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499460 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:31:37.499620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499468 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:31:37.499620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499473 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:31:37.499620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499482 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 13:31:37.499620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.499491 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:31:37.500546 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.500535 
2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 13:31:37.500583 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.500548 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 13:31:37.503298 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.503275 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 13:31:37.503298 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.503291 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-40.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 13:31:37.503429 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.503333 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l27kh"
Apr 23 13:31:37.503482 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.503456 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-40.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 13:31:37.504676 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.504663 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 13:31:37.504709 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.504700 2582 server.go:1295] "Started kubelet"
Apr 23 13:31:37.504814 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.504788 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 13:31:37.504863 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.504814 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 13:31:37.505069 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.505049 2582 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 13:31:37.505552 ip-10-0-139-40 systemd[1]: Started Kubernetes Kubelet.
Apr 23 13:31:37.507259 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.507234 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:31:37.508147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.508132 2582 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:31:37.514277 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.514258 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:31:37.518994 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.518974 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:31:37.520297 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.520279 2582 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:31:37.520382 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.520300 2582 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:31:37.520696 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.520672 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:31:37.520891 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.520877 2582 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:31:37.520983 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.520902 2582 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:31:37.521120 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.521059 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found"
Apr 23 13:31:37.521565 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.521507 2582 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:31:37.522191 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522175 2582 factory.go:153] Registering CRI-O factory
Apr 23 13:31:37.522285 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522195 2582 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:31:37.522285 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522260 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:31:37.522285 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522270 2582 factory.go:55] Registering systemd factory
Apr 23 13:31:37.522285 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522278 2582 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:31:37.522450 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522327 2582 factory.go:103] Registering Raw factory
Apr 23 13:31:37.522531 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.522519 2582 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:31:37.524022 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.523996 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:37.524231 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.524209 2582 manager.go:319] Starting recovery of all containers
Apr 23 13:31:37.528320 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.528293 2582 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-40.ec2.internal\" not found" node="ip-10-0-139-40.ec2.internal"
Apr 23 13:31:37.531623 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.531491 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:31:37.535412 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.535395 2582 manager.go:324] Recovery completed
Apr 23 13:31:37.539546 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.539532 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:37.542104 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.542090 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:37.542163 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.542116 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:37.542163 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.542127 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:37.542604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.542592 2582 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:31:37.542604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.542602 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:31:37.542691 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.542618 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:37.545408 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.545397 2582 policy_none.go:49] "None policy: Start"
Apr 23 13:31:37.545444 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.545413 2582 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:31:37.545444 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.545422 2582 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:31:37.587472 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587450 2582 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.587479 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587489 2582 server.go:85] "Starting device plugin registration server"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587723 2582 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587735 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587838 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587911 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.587937 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.588375 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:31:37.593265 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.588404 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-40.ec2.internal\" not found"
Apr 23 13:31:37.665136 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.665064 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:31:37.665136 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.665107 2582 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:31:37.665136 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.665129 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:31:37.665136 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.665139 2582 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:31:37.665339 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.665177 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:31:37.667423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.667407 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:37.688183 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.688163 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:37.689001 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.688985 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:37.689071 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.689014 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:37.689071 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.689027 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:37.689071 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.689051 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-40.ec2.internal"
Apr 23 13:31:37.700008 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.699987 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-40.ec2.internal"
Apr 23 13:31:37.700093 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.700012 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-40.ec2.internal\": node \"ip-10-0-139-40.ec2.internal\" not found"
Apr 23 13:31:37.714053 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.714034 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found"
Apr 23 13:31:37.765511 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.765476 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal"]
Apr 23 13:31:37.765616 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.765582 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:37.766430 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:31:37.766408 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:37.766535 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.766441 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:37.766535 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.766455 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:37.767550 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.767536 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:37.767661 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.767647 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.767699 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.767676 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:37.768219 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.768205 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:37.768296 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.768216 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:37.768296 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.768235 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:37.768296 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.768239 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:37.768296 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.768247 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:37.768296 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.768251 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:37.769435 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.769417 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.769489 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.769447 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:37.770139 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.770112 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:37.770139 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.770139 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:37.770229 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.770150 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:37.795464 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.795437 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-40.ec2.internal\" not found" node="ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.799406 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.799389 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-40.ec2.internal\" not found" node="ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.814949 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.814933 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found" Apr 23 13:31:37.823358 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.823340 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6662ba3d19b8f35202b9c72562560f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal\" (UID: \"e6662ba3d19b8f35202b9c72562560f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.823406 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.823366 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/561727a35e04946faef12be860e97824-config\") pod \"kube-apiserver-proxy-ip-10-0-139-40.ec2.internal\" (UID: \"561727a35e04946faef12be860e97824\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.823406 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.823382 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6662ba3d19b8f35202b9c72562560f0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal\" (UID: \"e6662ba3d19b8f35202b9c72562560f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.915416 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:37.915334 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found" Apr 23 13:31:37.924216 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.924187 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6662ba3d19b8f35202b9c72562560f0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal\" (UID: \"e6662ba3d19b8f35202b9c72562560f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.924299 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.924233 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6662ba3d19b8f35202b9c72562560f0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal\" (UID: \"e6662ba3d19b8f35202b9c72562560f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.924299 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.924285 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6662ba3d19b8f35202b9c72562560f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal\" (UID: \"e6662ba3d19b8f35202b9c72562560f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.924366 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.924310 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/561727a35e04946faef12be860e97824-config\") pod \"kube-apiserver-proxy-ip-10-0-139-40.ec2.internal\" (UID: \"561727a35e04946faef12be860e97824\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.924366 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.924360 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/561727a35e04946faef12be860e97824-config\") pod \"kube-apiserver-proxy-ip-10-0-139-40.ec2.internal\" (UID: \"561727a35e04946faef12be860e97824\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" Apr 23 13:31:37.924425 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:37.924375 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6662ba3d19b8f35202b9c72562560f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal\" (UID: \"e6662ba3d19b8f35202b9c72562560f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:38.015611 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.015557 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found" Apr 23 13:31:38.097101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.097066 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:38.101549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.101532 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" Apr 23 13:31:38.115978 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.115955 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found" Apr 23 13:31:38.216607 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.216530 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found" Apr 23 13:31:38.317088 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.317065 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-40.ec2.internal\" not found" Apr 23 13:31:38.411770 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.411744 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:38.413770 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.413752 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:38.418932 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.418903 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 13:31:38.419055 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.419035 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:38.419055 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.419039 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:38.419135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.419052 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:38.419135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.419041 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:38.420045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.420033 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" Apr 23 13:31:38.440052 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.440034 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:38.440825 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.440813 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" Apr 23 13:31:38.449748 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.449732 2582 warnings.go:110] 
"Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:38.491958 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.491872 2582 apiserver.go:52] "Watching apiserver" Apr 23 13:31:38.500050 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.500030 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 13:31:38.502761 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.502739 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-r2klv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6","openshift-image-registry/node-ca-kz76p","openshift-multus/multus-dm595","openshift-multus/network-metrics-daemon-8vwqm","openshift-ovn-kubernetes/ovnkube-node-cx2lr","kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal","openshift-cluster-node-tuning-operator/tuned-9ss9q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal","openshift-multus/multus-additional-cni-plugins-vdtkd","openshift-network-diagnostics/network-check-target-dnqkh","openshift-network-operator/iptables-alerter-rgtnc"] Apr 23 13:31:38.504536 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.504518 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.505390 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.505362 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:26:37 +0000 UTC" deadline="2027-10-10 05:43:39.358471088 +0000 UTC" Apr 23 13:31:38.505464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.505390 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12832h12m0.853084209s" Apr 23 13:31:38.505617 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.505603 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.506755 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.506735 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 13:31:38.506755 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.506751 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 13:31:38.506943 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.506778 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2frzk\"" Apr 23 13:31:38.507547 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.507531 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.507996 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.507971 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-d7plh\"" Apr 23 13:31:38.508093 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.507996 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:31:38.508093 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.508002 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:31:38.508208 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.508088 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:31:38.508571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.508549 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dm595" Apr 23 13:31:38.509713 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.509668 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:38.509805 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.509742 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:31:38.509805 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.509787 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:38.509937 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.509807 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:31:38.509937 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.509854 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.510666 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.510653 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:31:38.510666 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.510665 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5h8nj\"" Apr 23 13:31:38.510767 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.510759 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:31:38.511057 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.511001 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.512697 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.511404 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:31:38.512697 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.511474 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:31:38.512697 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.511893 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:31:38.512697 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.512520 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5h8nb\"" Apr 23 13:31:38.512940 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.512834 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:31:38.512940 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.512860 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9whf8\"" Apr 23 13:31:38.513027 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.513000 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:31:38.513137 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.513120 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514320 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514378 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514389 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514402 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514410 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514425 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-prq5s\"" Apr 23 13:31:38.514571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514436 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:31:38.514869 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.514679 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.515889 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.515873 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:38.515979 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.515959 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:38.516821 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.516806 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:31:38.516952 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.516886 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pzgzn\"" Apr 23 13:31:38.516952 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.516895 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:31:38.517122 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.517108 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.519516 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.519468 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:38.519879 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.519863 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:38.520165 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.520148 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xc524\"" Apr 23 13:31:38.520250 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.520149 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:31:38.521416 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.521399 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 13:31:38.525059 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.525043 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:31:38.527539 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527517 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-etc-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.527637 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527555 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-ovnkube-config\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.527637 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527582 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-env-overrides\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.527637 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527628 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b96ebd94-9b73-4821-8946-4734e772932d-serviceca\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527654 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zqrd\" (UniqueName: \"kubernetes.io/projected/b96ebd94-9b73-4821-8946-4734e772932d-kube-api-access-2zqrd\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527680 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-systemd\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527705 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/431c9349-7f7f-4d46-8b03-2517188be63c-kube-api-access-hql8q\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527727 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-log-socket\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527748 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527772 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-var-lib-kubelet\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527794 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/66ef4cab-5277-4b3a-a87a-8cc03965a437-konnectivity-ca\") pod \"konnectivity-agent-r2klv\" (UID: \"66ef4cab-5277-4b3a-a87a-8cc03965a437\") " pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527817 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-sys-fs\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527840 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-cnibin\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.527864 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527863 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30d85b9d-16ae-419a-8534-8b142607909e-ovn-node-metrics-cert\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527890 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-systemd\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527933 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d70fbe1-9754-48a1-82b6-1656723cda25-tmp\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527959 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzkxg\" (UniqueName: \"kubernetes.io/projected/6d70fbe1-9754-48a1-82b6-1656723cda25-kube-api-access-hzkxg\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.527983 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-cni-bin\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528008 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkx4\" (UniqueName: \"kubernetes.io/projected/ceb05455-1014-4f0a-bf0c-498593be94d6-kube-api-access-mdkx4\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96ebd94-9b73-4821-8946-4734e772932d-host\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528066 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-cnibin\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528103 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-cni-multus\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528139 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-var-lib-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528179 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-run-ovn-kubernetes\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528200 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-ovnkube-script-lib\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528218 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-os-release\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528233 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzl4f\" (UniqueName: \"kubernetes.io/projected/1f7b9e0a-9c75-402a-9f74-7dc83741af82-kube-api-access-kzl4f\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528246 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f7b9e0a-9c75-402a-9f74-7dc83741af82-cni-binary-copy\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528266 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-socket-dir-parent\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528281 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysctl-conf\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528309 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-conf-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528342 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-system-cni-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528359 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-hostroot\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528373 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvm4f\" (UniqueName: \"kubernetes.io/projected/bf879d65-39bb-4d9a-aa57-7d499026e167-kube-api-access-fvm4f\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:38.528392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528388 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-system-cni-dir\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: 
\"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528409 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-os-release\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528439 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a0b8e16-337d-4350-8a10-754ae14c0ea7-host-slash\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528467 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528489 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528509 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-cni-binary-copy\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528531 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-sys\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528562 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-tuned\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528589 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " 
pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528608 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-systemd-units\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528623 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528636 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-node-log\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528656 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-device-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528693 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-netns\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528728 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-kubelet\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528751 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-cni-netd\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.528833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528791 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528819 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-modprobe-d\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528851 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysconfig\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528874 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-run\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528893 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528933 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-daemon-config\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528952 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-etc-kubernetes\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528970 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-slash\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.528992 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-lib-modules\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529010 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-host\") pod \"tuned-9ss9q\" (UID: 
\"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529030 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529054 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1a0b8e16-337d-4350-8a10-754ae14c0ea7-iptables-alerter-script\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529080 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hb7s\" (UniqueName: \"kubernetes.io/projected/30d85b9d-16ae-419a-8534-8b142607909e-kube-api-access-9hb7s\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529107 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-cni-bin\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529138 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-kubelet\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529161 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-ovn\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.529343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529179 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-kubernetes\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529195 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/66ef4cab-5277-4b3a-a87a-8cc03965a437-agent-certs\") pod \"konnectivity-agent-r2klv\" (UID: \"66ef4cab-5277-4b3a-a87a-8cc03965a437\") " pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.529771 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:31:38.529212 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-registration-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529237 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-multus-certs\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529254 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-run-netns\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529271 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529292 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysctl-d\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529324 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxbz\" (UniqueName: \"kubernetes.io/projected/1a0b8e16-337d-4350-8a10-754ae14c0ea7-kube-api-access-ctxbz\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529344 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-socket-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529361 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-cni-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.529771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.529395 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-k8s-cni-cncf-io\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.542280 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.542254 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kpvrb" Apr 23 13:31:38.550371 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.550355 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kpvrb" Apr 23 13:31:38.631146 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631118 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96ebd94-9b73-4821-8946-4734e772932d-host\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631152 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-cnibin\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631174 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-cni-multus\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631199 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-var-lib-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631223 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-run-ovn-kubernetes\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631246 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-ovnkube-script-lib\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631261 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-var-lib-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.631322 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:31:38.631242 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b96ebd94-9b73-4821-8946-4734e772932d-host\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631267 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-cnibin\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631263 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-cni-multus\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631278 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-run-ovn-kubernetes\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631271 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-os-release\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631388 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzl4f\" (UniqueName: \"kubernetes.io/projected/1f7b9e0a-9c75-402a-9f74-7dc83741af82-kube-api-access-kzl4f\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631420 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f7b9e0a-9c75-402a-9f74-7dc83741af82-cni-binary-copy\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631445 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-socket-dir-parent\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631469 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysctl-conf\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631495 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-conf-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631516 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-system-cni-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631539 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-hostroot\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631562 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvm4f\" (UniqueName: \"kubernetes.io/projected/bf879d65-39bb-4d9a-aa57-7d499026e167-kube-api-access-fvm4f\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631584 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-system-cni-dir\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631607 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-os-release\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631631 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a0b8e16-337d-4350-8a10-754ae14c0ea7-host-slash\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631627 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-conf-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631654 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.631797 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631677 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631682 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-socket-dir-parent\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.631797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631700 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-cni-binary-copy\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631725 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-sys\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631748 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysctl-conf\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631754 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-os-release\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631446 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-os-release\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631796 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a0b8e16-337d-4350-8a10-754ae14c0ea7-host-slash\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631822 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-sys\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631844 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-system-cni-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631750 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-tuned\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631874 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-ovnkube-script-lib\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631860 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-system-cni-dir\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631888 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631957 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-systemd-units\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.631985 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632010 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-node-log\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632013 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: 
\"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632023 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f7b9e0a-9c75-402a-9f74-7dc83741af82-cni-binary-copy\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.631985 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:38.632604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632048 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-device-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632062 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-hostroot\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632076 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-netns\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632078 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632108 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-netns\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632111 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.632121 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.132088196 +0000 UTC m=+2.119329071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632148 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-node-log\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632158 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-kubelet\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632164 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-device-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632183 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-cni-netd\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632189 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-systemd-units\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632204 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-kubelet\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632212 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632234 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-cni-netd\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.633467 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:31:38.632239 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-modprobe-d\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632262 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysconfig\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.633467 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632270 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-cni-binary-copy\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632286 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-run\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632309 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632326 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysconfig\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632330 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-modprobe-d\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632333 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-daemon-config\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632358 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-etc-kubernetes\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 
23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632366 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632375 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-run\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632384 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-slash\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632425 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-etc-kubernetes\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632437 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-slash\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632457 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-lib-modules\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632483 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-host\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632473 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632511 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632556 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1a0b8e16-337d-4350-8a10-754ae14c0ea7-iptables-alerter-script\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632561 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-host\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632568 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-lib-modules\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632581 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hb7s\" (UniqueName: \"kubernetes.io/projected/30d85b9d-16ae-419a-8534-8b142607909e-kube-api-access-9hb7s\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632606 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-cni-bin\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632622 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632673 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-kubelet\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632628 2582 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-kubelet\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-ovn\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632732 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-kubernetes\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632737 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-var-lib-cni-bin\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632758 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/66ef4cab-5277-4b3a-a87a-8cc03965a437-agent-certs\") pod \"konnectivity-agent-r2klv\" (UID: \"66ef4cab-5277-4b3a-a87a-8cc03965a437\") " pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632781 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-ovn\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632785 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-registration-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632818 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-multus-certs\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632797 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-kubernetes\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632832 2582 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-registration-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632846 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-run-netns\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632866 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-daemon-config\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.634888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632877 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-run-netns\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632882 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-multus-certs\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632877 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632951 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysctl-d\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632978 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxbz\" (UniqueName: \"kubernetes.io/projected/1a0b8e16-337d-4350-8a10-754ae14c0ea7-kube-api-access-ctxbz\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.632999 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " 
pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633002 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-socket-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633028 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1a0b8e16-337d-4350-8a10-754ae14c0ea7-iptables-alerter-script\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633104 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-socket-dir\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633108 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-sysctl-d\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633127 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-cni-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633154 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-k8s-cni-cncf-io\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633162 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-multus-cni-dir\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633175 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-etc-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633184 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f7b9e0a-9c75-402a-9f74-7dc83741af82-host-run-k8s-cni-cncf-io\") 
pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633199 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-ovnkube-config\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633218 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-etc-openvswitch\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.635433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633221 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-env-overrides\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633242 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b96ebd94-9b73-4821-8946-4734e772932d-serviceca\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633264 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zqrd\" (UniqueName: \"kubernetes.io/projected/b96ebd94-9b73-4821-8946-4734e772932d-kube-api-access-2zqrd\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633284 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-systemd\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633307 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/431c9349-7f7f-4d46-8b03-2517188be63c-kube-api-access-hql8q\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-log-socket\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633358 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633404 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-var-lib-kubelet\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633430 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/66ef4cab-5277-4b3a-a87a-8cc03965a437-konnectivity-ca\") pod \"konnectivity-agent-r2klv\" (UID: \"66ef4cab-5277-4b3a-a87a-8cc03965a437\") " pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633453 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-sys-fs\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633476 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-cnibin\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633517 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30d85b9d-16ae-419a-8534-8b142607909e-ovn-node-metrics-cert\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633543 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-systemd\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633565 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d70fbe1-9754-48a1-82b6-1656723cda25-tmp\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzkxg\" (UniqueName: \"kubernetes.io/projected/6d70fbe1-9754-48a1-82b6-1656723cda25-kube-api-access-hzkxg\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:31:38.633609 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-cni-bin\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633635 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkx4\" (UniqueName: \"kubernetes.io/projected/ceb05455-1014-4f0a-bf0c-498593be94d6-kube-api-access-mdkx4\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.636131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633652 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-ovnkube-config\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633681 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b96ebd94-9b73-4821-8946-4734e772932d-serviceca\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633716 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ceb05455-1014-4f0a-bf0c-498593be94d6-sys-fs\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633730 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-log-socket\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633757 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/431c9349-7f7f-4d46-8b03-2517188be63c-cnibin\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633775 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-var-lib-kubelet\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.633815 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-run-systemd\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636884 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.634001 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/431c9349-7f7f-4d46-8b03-2517188be63c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.634060 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-systemd\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.634101 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/66ef4cab-5277-4b3a-a87a-8cc03965a437-konnectivity-ca\") pod \"konnectivity-agent-r2klv\" (UID: \"66ef4cab-5277-4b3a-a87a-8cc03965a437\") " pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.634103 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30d85b9d-16ae-419a-8534-8b142607909e-host-cni-bin\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.634357 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30d85b9d-16ae-419a-8534-8b142607909e-env-overrides\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.636243 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d70fbe1-9754-48a1-82b6-1656723cda25-tmp\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.636320 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d70fbe1-9754-48a1-82b6-1656723cda25-etc-tuned\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.636527 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30d85b9d-16ae-419a-8534-8b142607909e-ovn-node-metrics-cert\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.636884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.636540 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/66ef4cab-5277-4b3a-a87a-8cc03965a437-agent-certs\") pod \"konnectivity-agent-r2klv\" (UID: \"66ef4cab-5277-4b3a-a87a-8cc03965a437\") " pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.641688 
ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.641652 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:38.641688 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.641674 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:38.641688 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.641687 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:38.642015 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:38.641782 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.141737012 +0000 UTC m=+2.128977887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:38.643764 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.643741 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzl4f\" (UniqueName: \"kubernetes.io/projected/1f7b9e0a-9c75-402a-9f74-7dc83741af82-kube-api-access-kzl4f\") pod \"multus-dm595\" (UID: \"1f7b9e0a-9c75-402a-9f74-7dc83741af82\") " pod="openshift-multus/multus-dm595" Apr 23 13:31:38.644156 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.644136 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxbz\" (UniqueName: \"kubernetes.io/projected/1a0b8e16-337d-4350-8a10-754ae14c0ea7-kube-api-access-ctxbz\") pod \"iptables-alerter-rgtnc\" (UID: \"1a0b8e16-337d-4350-8a10-754ae14c0ea7\") " pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.644247 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.644188 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzkxg\" (UniqueName: \"kubernetes.io/projected/6d70fbe1-9754-48a1-82b6-1656723cda25-kube-api-access-hzkxg\") pod \"tuned-9ss9q\" (UID: \"6d70fbe1-9754-48a1-82b6-1656723cda25\") " pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.644306 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.644249 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hb7s\" (UniqueName: \"kubernetes.io/projected/30d85b9d-16ae-419a-8534-8b142607909e-kube-api-access-9hb7s\") pod \"ovnkube-node-cx2lr\" (UID: \"30d85b9d-16ae-419a-8534-8b142607909e\") " pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.644519 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.644498 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvm4f\" (UniqueName: 
\"kubernetes.io/projected/bf879d65-39bb-4d9a-aa57-7d499026e167-kube-api-access-fvm4f\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:38.644580 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.644505 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zqrd\" (UniqueName: \"kubernetes.io/projected/b96ebd94-9b73-4821-8946-4734e772932d-kube-api-access-2zqrd\") pod \"node-ca-kz76p\" (UID: \"b96ebd94-9b73-4821-8946-4734e772932d\") " pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.644776 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.644759 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkx4\" (UniqueName: \"kubernetes.io/projected/ceb05455-1014-4f0a-bf0c-498593be94d6-kube-api-access-mdkx4\") pod \"aws-ebs-csi-driver-node-n5mz6\" (UID: \"ceb05455-1014-4f0a-bf0c-498593be94d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.645053 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.645038 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/431c9349-7f7f-4d46-8b03-2517188be63c-kube-api-access-hql8q\") pod \"multus-additional-cni-plugins-vdtkd\" (UID: \"431c9349-7f7f-4d46-8b03-2517188be63c\") " pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.700699 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.700669 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod561727a35e04946faef12be860e97824.slice/crio-c880978c11e0069b59cc80a601932d9cfa2822be3612adc9936fb6c886f3c2b6 WatchSource:0}: Error finding container c880978c11e0069b59cc80a601932d9cfa2822be3612adc9936fb6c886f3c2b6: Status 404 returned error can't find the container with id c880978c11e0069b59cc80a601932d9cfa2822be3612adc9936fb6c886f3c2b6 Apr 23 13:31:38.704245 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.704230 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:31:38.828721 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.828641 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:38.834851 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.834811 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ef4cab_5277_4b3a_a87a_8cc03965a437.slice/crio-168f5e933d386ec72a651185e2d0171295a8a93ed62ba512bd820923b1d25e94 WatchSource:0}: Error finding container 168f5e933d386ec72a651185e2d0171295a8a93ed62ba512bd820923b1d25e94: Status 404 returned error can't find the container with id 168f5e933d386ec72a651185e2d0171295a8a93ed62ba512bd820923b1d25e94 Apr 23 13:31:38.838646 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.838629 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" Apr 23 13:31:38.844582 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.844561 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb05455_1014_4f0a_bf0c_498593be94d6.slice/crio-1d2aaa63393d73547d01a0916ff99f3340db03ec5ee531816a0079e4a2f2ca46 WatchSource:0}: Error finding container 1d2aaa63393d73547d01a0916ff99f3340db03ec5ee531816a0079e4a2f2ca46: Status 404 returned error can't find the container with id 1d2aaa63393d73547d01a0916ff99f3340db03ec5ee531816a0079e4a2f2ca46 Apr 23 13:31:38.866438 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.866414 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kz76p" Apr 23 13:31:38.872105 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.872082 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96ebd94_9b73_4821_8946_4734e772932d.slice/crio-bd2ca8b4c2c1ca7f09d64058a31013ef3d63e9f4c51b7f085dce88d1d55278fd WatchSource:0}: Error finding container bd2ca8b4c2c1ca7f09d64058a31013ef3d63e9f4c51b7f085dce88d1d55278fd: Status 404 returned error can't find the container with id bd2ca8b4c2c1ca7f09d64058a31013ef3d63e9f4c51b7f085dce88d1d55278fd Apr 23 13:31:38.879851 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.879835 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dm595" Apr 23 13:31:38.885301 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.885282 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7b9e0a_9c75_402a_9f74_7dc83741af82.slice/crio-eb443f8b89b464ec3cd5f0fb5e97ffe9f5379f9f0005dcb030beff4630ec71c7 WatchSource:0}: Error finding container eb443f8b89b464ec3cd5f0fb5e97ffe9f5379f9f0005dcb030beff4630ec71c7: Status 404 returned error can't find the container with id eb443f8b89b464ec3cd5f0fb5e97ffe9f5379f9f0005dcb030beff4630ec71c7 Apr 23 13:31:38.893227 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.893210 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:31:38.898499 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.898478 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30d85b9d_16ae_419a_8534_8b142607909e.slice/crio-bcd55f681fd578424b7903dfb71a423c073fcb1bbc3f79a30a00475953321043 WatchSource:0}: Error finding container bcd55f681fd578424b7903dfb71a423c073fcb1bbc3f79a30a00475953321043: Status 404 returned error can't find the container with id bcd55f681fd578424b7903dfb71a423c073fcb1bbc3f79a30a00475953321043 Apr 23 13:31:38.907397 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.907382 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" Apr 23 13:31:38.912957 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.912937 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d70fbe1_9754_48a1_82b6_1656723cda25.slice/crio-03ed99f24284b69800f52ba342f622259527880e65fea1ffc415669a74603382 WatchSource:0}: Error finding container 03ed99f24284b69800f52ba342f622259527880e65fea1ffc415669a74603382: Status 404 returned error can't find the container with id 03ed99f24284b69800f52ba342f622259527880e65fea1ffc415669a74603382 Apr 23 13:31:38.924985 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.924967 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" Apr 23 13:31:38.931659 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:38.931643 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rgtnc" Apr 23 13:31:38.939710 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:38.939661 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a0b8e16_337d_4350_8a10_754ae14c0ea7.slice/crio-28f725ef7c16ed3383adeaa906468f3f6a0ca9e42ca88eb3c8e3541a8a0fb68a WatchSource:0}: Error finding container 28f725ef7c16ed3383adeaa906468f3f6a0ca9e42ca88eb3c8e3541a8a0fb68a: Status 404 returned error can't find the container with id 28f725ef7c16ed3383adeaa906468f3f6a0ca9e42ca88eb3c8e3541a8a0fb68a Apr 23 13:31:39.137011 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.136978 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:39.137164 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.137117 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:39.137213 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.137172 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:40.137157629 +0000 UTC m=+3.124398486 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:39.237478 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.237446 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:39.237661 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.237569 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:39.237661 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.237583 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:39.237661 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.237593 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:39.237815 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.237681 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:40.237663041 +0000 UTC m=+3.224903898 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:39.376352 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.376311 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:39.435437 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.435358 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:39.551321 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.551229 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:38 +0000 UTC" deadline="2028-01-08 19:13:43.268778502 +0000 UTC" Apr 23 13:31:39.551321 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.551272 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15005h42m3.717510359s" Apr 23 13:31:39.668852 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.668376 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:39.668852 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:39.668492 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:39.679153 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.679101 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"bcd55f681fd578424b7903dfb71a423c073fcb1bbc3f79a30a00475953321043"} Apr 23 13:31:39.694400 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.694323 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dm595" event={"ID":"1f7b9e0a-9c75-402a-9f74-7dc83741af82","Type":"ContainerStarted","Data":"eb443f8b89b464ec3cd5f0fb5e97ffe9f5379f9f0005dcb030beff4630ec71c7"} Apr 23 13:31:39.698636 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.698606 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kz76p" event={"ID":"b96ebd94-9b73-4821-8946-4734e772932d","Type":"ContainerStarted","Data":"bd2ca8b4c2c1ca7f09d64058a31013ef3d63e9f4c51b7f085dce88d1d55278fd"} Apr 23 13:31:39.710731 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.710502 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" event={"ID":"ceb05455-1014-4f0a-bf0c-498593be94d6","Type":"ContainerStarted","Data":"1d2aaa63393d73547d01a0916ff99f3340db03ec5ee531816a0079e4a2f2ca46"} Apr 23 13:31:39.724263 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.724208 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerStarted","Data":"5f925d22622f489610802552c221ba06a631418ec83d062546899364e9d96bdf"} Apr 23 13:31:39.735904 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.735876 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" event={"ID":"6d70fbe1-9754-48a1-82b6-1656723cda25","Type":"ContainerStarted","Data":"03ed99f24284b69800f52ba342f622259527880e65fea1ffc415669a74603382"} Apr 23 13:31:39.741301 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.741275 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r2klv" event={"ID":"66ef4cab-5277-4b3a-a87a-8cc03965a437","Type":"ContainerStarted","Data":"168f5e933d386ec72a651185e2d0171295a8a93ed62ba512bd820923b1d25e94"} Apr 23 13:31:39.764327 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.764296 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" event={"ID":"561727a35e04946faef12be860e97824","Type":"ContainerStarted","Data":"c880978c11e0069b59cc80a601932d9cfa2822be3612adc9936fb6c886f3c2b6"} Apr 23 13:31:39.767882 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:39.767856 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rgtnc" 
event={"ID":"1a0b8e16-337d-4350-8a10-754ae14c0ea7","Type":"ContainerStarted","Data":"28f725ef7c16ed3383adeaa906468f3f6a0ca9e42ca88eb3c8e3541a8a0fb68a"} Apr 23 13:31:40.088218 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.088146 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:40.144701 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.144665 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:40.144939 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.144873 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:40.145022 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.144990 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:42.144966273 +0000 UTC m=+5.132207134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:40.245451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.245406 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:40.245631 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.245570 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:40.245631 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.245588 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:40.245631 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.245601 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:40.245849 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.245667 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:42.245649442 +0000 UTC m=+5.232890303 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:40.551775 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.551727 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:38 +0000 UTC" deadline="2028-01-08 10:20:32.401326544 +0000 UTC" Apr 23 13:31:40.551775 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.551767 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14996h48m51.849563138s" Apr 23 13:31:40.666092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.666057 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:40.666283 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:40.666200 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:40.925982 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:40.925632 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:41.667572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:41.666109 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:41.667572 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:41.666241 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:42.163729 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:42.163699 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:42.163897 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.163824 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:42.163897 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.163870 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:46.163855392 +0000 UTC m=+9.151096261 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:42.264565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:42.264529 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:42.264744 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.264700 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:42.264744 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.264720 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:42.264744 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.264732 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:42.264902 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.264787 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:46.264768714 +0000 UTC m=+9.252009591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:42.665750 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:42.665723 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:42.665958 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:42.665834 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:43.666515 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:43.666006 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:43.666515 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:43.666145 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:44.665542 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:44.665506 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:44.665704 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:44.665646 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:45.669430 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:45.669398 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:45.669874 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:45.669519 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:46.196852 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:46.196815 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:46.197061 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.196961 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:46.197061 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.197044 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.197022357 +0000 UTC m=+17.184263217 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:46.297743 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:46.297704 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:46.297979 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.297932 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:46.298046 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.298018 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:46.298046 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.298031 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:46.298113 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.298081 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.298065506 +0000 UTC m=+17.285306363 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:46.665686 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:46.665653 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:46.665872 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:46.665775 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:47.674712 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:47.674678 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:47.675131 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:47.674826 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:48.665882 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:48.665846 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:48.666072 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:48.665978 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:49.665698 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:49.665661 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:49.666198 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:49.665788 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:50.665630 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.665598 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:50.665831 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:50.665727 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:50.838506 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.838465 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9tk4c"] Apr 23 13:31:50.882511 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.882473 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:50.886693 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.886663 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:50.886843 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.886720 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:50.887021 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.887003 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kbm2f\"" Apr 23 13:31:50.933000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.932900 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25990e8e-c5b7-435c-8980-d1c4bd84116d-hosts-file\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:50.933000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.932959 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25990e8e-c5b7-435c-8980-d1c4bd84116d-tmp-dir\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:50.933210 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:50.933060 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvks4\" (UniqueName: \"kubernetes.io/projected/25990e8e-c5b7-435c-8980-d1c4bd84116d-kube-api-access-hvks4\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.034180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.034126 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25990e8e-c5b7-435c-8980-d1c4bd84116d-hosts-file\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.034180 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.034171 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25990e8e-c5b7-435c-8980-d1c4bd84116d-tmp-dir\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.034398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.034230 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvks4\" (UniqueName: \"kubernetes.io/projected/25990e8e-c5b7-435c-8980-d1c4bd84116d-kube-api-access-hvks4\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.034398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.034262 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25990e8e-c5b7-435c-8980-d1c4bd84116d-hosts-file\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.034578 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.034558 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25990e8e-c5b7-435c-8980-d1c4bd84116d-tmp-dir\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.049565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.049536 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvks4\" (UniqueName: \"kubernetes.io/projected/25990e8e-c5b7-435c-8980-d1c4bd84116d-kube-api-access-hvks4\") pod \"node-resolver-9tk4c\" (UID: \"25990e8e-c5b7-435c-8980-d1c4bd84116d\") " pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.203334 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.203252 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9tk4c" Apr 23 13:31:51.666092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:51.666052 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:51.666526 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:51.666162 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:52.665739 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:52.665706 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:52.665973 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:52.665837 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:53.666069 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:53.666033 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:53.666517 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:53.666163 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:54.255352 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:54.255314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:54.255555 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.255474 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:54.255555 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.255550 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:10.255528317 +0000 UTC m=+33.242769196 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:54.356083 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:54.356036 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:54.356265 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.356220 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:54.356265 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.356247 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:54.356265 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.356262 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:54.356418 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.356319 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:10.356301104 +0000 UTC m=+33.343541961 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:54.665752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:54.665719 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:54.665931 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:54.665828 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:55.024738 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.024662 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-msdj4"] Apr 23 13:31:55.059827 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.059792 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.060030 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:55.059879 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:31:55.162494 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.162451 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97134a37-e40a-4587-b02d-795b8a714cc0-kubelet-config\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.162494 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.162515 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97134a37-e40a-4587-b02d-795b8a714cc0-dbus\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.162735 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.162543 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.263841 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.263792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97134a37-e40a-4587-b02d-795b8a714cc0-kubelet-config\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.264042 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.263858 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97134a37-e40a-4587-b02d-795b8a714cc0-dbus\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.264042 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.263884 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.264042 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.263961 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97134a37-e40a-4587-b02d-795b8a714cc0-kubelet-config\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.264163 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:55.264036 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:55.264163 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.264072 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97134a37-e40a-4587-b02d-795b8a714cc0-dbus\") pod \"global-pull-secret-syncer-msdj4\" (UID: 
\"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.264163 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:55.264107 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret podName:97134a37-e40a-4587-b02d-795b8a714cc0 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:55.764086233 +0000 UTC m=+18.751327104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret") pod "global-pull-secret-syncer-msdj4" (UID: "97134a37-e40a-4587-b02d-795b8a714cc0") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:55.666353 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.666311 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:55.666525 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:55.666427 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:55.767784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:55.767742 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:55.767982 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:55.767873 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:55.767982 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:55.767956 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret podName:97134a37-e40a-4587-b02d-795b8a714cc0 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:56.767936938 +0000 UTC m=+19.755177796 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret") pod "global-pull-secret-syncer-msdj4" (UID: "97134a37-e40a-4587-b02d-795b8a714cc0") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:56.543603 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:56.543284 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25990e8e_c5b7_435c_8980_d1c4bd84116d.slice/crio-312497199f36611d0775a297b0d133a3ac74907d03c6c5e06e4571c57ed4cfd5 WatchSource:0}: Error finding container 312497199f36611d0775a297b0d133a3ac74907d03c6c5e06e4571c57ed4cfd5: Status 404 returned error can't find the container with id 312497199f36611d0775a297b0d133a3ac74907d03c6c5e06e4571c57ed4cfd5 Apr 23 13:31:56.548890 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:31:56.548859 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6662ba3d19b8f35202b9c72562560f0.slice/crio-b4970e0edecc17059c4674b743321c97bbbdaef6ab822c03f5b7246c65033427 WatchSource:0}: Error finding container b4970e0edecc17059c4674b743321c97bbbdaef6ab822c03f5b7246c65033427: Status 404 returned error can't find the container with id b4970e0edecc17059c4674b743321c97bbbdaef6ab822c03f5b7246c65033427 Apr 23 13:31:56.666276 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:56.666248 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:56.666276 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:56.666269 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:56.666424 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:56.666354 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:31:56.666488 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:56.666468 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:56.774465 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:56.774438 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:56.774571 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:56.774549 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:56.774605 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:56.774601 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret podName:97134a37-e40a-4587-b02d-795b8a714cc0 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.774587288 +0000 UTC m=+21.761828145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret") pod "global-pull-secret-syncer-msdj4" (UID: "97134a37-e40a-4587-b02d-795b8a714cc0") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:56.805572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:56.805535 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" event={"ID":"e6662ba3d19b8f35202b9c72562560f0","Type":"ContainerStarted","Data":"b4970e0edecc17059c4674b743321c97bbbdaef6ab822c03f5b7246c65033427"} Apr 23 13:31:56.806877 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:56.806845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9tk4c" event={"ID":"25990e8e-c5b7-435c-8980-d1c4bd84116d","Type":"ContainerStarted","Data":"312497199f36611d0775a297b0d133a3ac74907d03c6c5e06e4571c57ed4cfd5"} Apr 23 13:31:57.669384 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.668898 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:57.670000 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:57.669474 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:57.810571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.810435 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9tk4c" event={"ID":"25990e8e-c5b7-435c-8980-d1c4bd84116d","Type":"ContainerStarted","Data":"0fe4e74765ec72e842090b66d745da1ee4f70ec1a4ac96af77705d53844183c8"} Apr 23 13:31:57.812160 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.812138 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:31:57.812562 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.812532 2582 generic.go:358] "Generic (PLEG): container finished" podID="30d85b9d-16ae-419a-8534-8b142607909e" containerID="9ac325cccf516995bc36baaa2d3e592bb55a202195de6a77e0d2901c40339f04" exitCode=1 Apr 23 13:31:57.812659 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.812595 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerDied","Data":"9ac325cccf516995bc36baaa2d3e592bb55a202195de6a77e0d2901c40339f04"} Apr 23 13:31:57.812659 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.812628 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"ba6f159de4082422664520452f5049cd482bba2b46149e498baa9a1977727d1c"} Apr 23 13:31:57.814357 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.814334 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dm595" event={"ID":"1f7b9e0a-9c75-402a-9f74-7dc83741af82","Type":"ContainerStarted","Data":"d22f24f2f69a077827f39363f0756ce3a14ec02e24862e1b13dcd38c91f5a37c"} Apr 23 13:31:57.816104 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.816082 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kz76p" event={"ID":"b96ebd94-9b73-4821-8946-4734e772932d","Type":"ContainerStarted","Data":"46c4cb7405ac462f5a553fb268907db68339b525e75d67fa652b71d0d508ced0"} Apr 23 13:31:57.817718 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.817693 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" event={"ID":"ceb05455-1014-4f0a-bf0c-498593be94d6","Type":"ContainerStarted","Data":"c889c0bf71a177592317f5f8fde8d3bc9cd7552282925a486b0a1a80d8bf5a92"} Apr 23 13:31:57.819421 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.819396 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerStarted","Data":"2dc7b62e9cafa97fdb72aa96b3a536c8e7975187157ecef2d9b7afb57643f3db"} Apr 23 13:31:57.820878 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.820853 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" event={"ID":"6d70fbe1-9754-48a1-82b6-1656723cda25","Type":"ContainerStarted","Data":"c211c3b107e11fd0325ed92a54083144d4b7005505dd5f3c93b3b06313f9305a"} Apr 23 13:31:57.822834 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.822807 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r2klv" 
event={"ID":"66ef4cab-5277-4b3a-a87a-8cc03965a437","Type":"ContainerStarted","Data":"992df07f557ee33528ad2430fe74fbaf959f8aa9c1eee79df8f4616c127ad2c6"} Apr 23 13:31:57.824745 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.824720 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" event={"ID":"561727a35e04946faef12be860e97824","Type":"ContainerStarted","Data":"d497ddfc933652e973f5591d2de9b36a21c17926508aedde248336926bf9f3f6"} Apr 23 13:31:57.826149 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.825332 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9tk4c" podStartSLOduration=7.825317892 podStartE2EDuration="7.825317892s" podCreationTimestamp="2026-04-23 13:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:57.825249925 +0000 UTC m=+20.812490804" watchObservedRunningTime="2026-04-23 13:31:57.825317892 +0000 UTC m=+20.812558773" Apr 23 13:31:57.838529 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.838478 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r2klv" podStartSLOduration=3.135239967 podStartE2EDuration="20.838462269s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.836274451 +0000 UTC m=+1.823515310" lastFinishedPulling="2026-04-23 13:31:56.539496741 +0000 UTC m=+19.526737612" observedRunningTime="2026-04-23 13:31:57.838453987 +0000 UTC m=+20.825694868" watchObservedRunningTime="2026-04-23 13:31:57.838462269 +0000 UTC m=+20.825703148" Apr 23 13:31:57.858720 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.858658 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dm595" podStartSLOduration=2.896947893 podStartE2EDuration="20.858638367s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.886643248 +0000 UTC m=+1.873884108" lastFinishedPulling="2026-04-23 13:31:56.848333724 +0000 UTC m=+19.835574582" observedRunningTime="2026-04-23 13:31:57.858407621 +0000 UTC m=+20.845648502" watchObservedRunningTime="2026-04-23 13:31:57.858638367 +0000 UTC m=+20.845879250" Apr 23 13:31:57.878062 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.878012 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9ss9q" podStartSLOduration=3.248118036 podStartE2EDuration="20.877996288s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.914243641 +0000 UTC m=+1.901484499" lastFinishedPulling="2026-04-23 13:31:56.544121893 +0000 UTC m=+19.531362751" observedRunningTime="2026-04-23 13:31:57.873654309 +0000 UTC m=+20.860895188" watchObservedRunningTime="2026-04-23 13:31:57.877996288 +0000 UTC m=+20.865237166" Apr 23 13:31:57.909439 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:57.909381 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-40.ec2.internal" podStartSLOduration=19.909362708 podStartE2EDuration="19.909362708s" podCreationTimestamp="2026-04-23 13:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:57.908688329 +0000 UTC m=+20.895929207" watchObservedRunningTime="2026-04-23 
13:31:57.909362708 +0000 UTC m=+20.896603588" Apr 23 13:31:58.519134 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.519106 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:31:58.601689 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.601521 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:31:58.519129037Z","UUID":"8471f020-28f5-4fe9-8d63-356df9a03cf8","Handler":null,"Name":"","Endpoint":""} Apr 23 13:31:58.604145 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.604120 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:31:58.604277 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.604152 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:31:58.665900 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.665866 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:58.666087 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.665876 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:31:58.666087 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:58.665993 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:31:58.666160 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:58.666084 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:31:58.791782 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.791747 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:31:58.792270 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:58.791854 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:58.792270 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:58.791905 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret podName:97134a37-e40a-4587-b02d-795b8a714cc0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:02.791890653 +0000 UTC m=+25.779131510 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret") pod "global-pull-secret-syncer-msdj4" (UID: "97134a37-e40a-4587-b02d-795b8a714cc0") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:58.828214 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.828180 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rgtnc" event={"ID":"1a0b8e16-337d-4350-8a10-754ae14c0ea7","Type":"ContainerStarted","Data":"92d05d142fd374c85aa87a0e721135bf6a23b2a16bda05fc5426138bcd274ba3"} Apr 23 13:31:58.830408 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.830379 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:31:58.830724 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.830703 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"085d79899281d1c00dd3a9e9161b31550bde84a6f57ff629047c0e29ddf6961a"} Apr 23 13:31:58.830810 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.830729 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"0672c4943a3a8ca2773f5c3628e22c8f86394e46f6fdd39834dd3b8a4b980f70"} Apr 23 13:31:58.830810 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.830739 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"a9b318e78fab5c18466a55ad6e0fd6e701adab5868ba6d8cb94441a0b9a1e030"} Apr 23 13:31:58.830810 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.830747 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"72a7dd9e21643ba2cd1b1aea05e2ce802be468a85ee17c416baabd9dd80f62ee"} Apr 23 13:31:58.832463 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.832393 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" event={"ID":"ceb05455-1014-4f0a-bf0c-498593be94d6","Type":"ContainerStarted","Data":"85ec13e0c69eb9602f43583c548b19ba036c797109889013d6f16d5311ad2c24"} Apr 23 13:31:58.833620 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.833596 2582 generic.go:358] "Generic (PLEG): container finished" podID="e6662ba3d19b8f35202b9c72562560f0" containerID="cd2e688d2fe83fda6a417753286cd5b00653e9b2ea6f81c2b156c9b3997ce71f" exitCode=0 Apr 23 13:31:58.833716 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.833651 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" event={"ID":"e6662ba3d19b8f35202b9c72562560f0","Type":"ContainerDied","Data":"cd2e688d2fe83fda6a417753286cd5b00653e9b2ea6f81c2b156c9b3997ce71f"} Apr 23 13:31:58.835054 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.835031 2582 generic.go:358] "Generic (PLEG): container finished" podID="431c9349-7f7f-4d46-8b03-2517188be63c" containerID="2dc7b62e9cafa97fdb72aa96b3a536c8e7975187157ecef2d9b7afb57643f3db" exitCode=0 Apr 23 13:31:58.835202 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.835126 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerDied","Data":"2dc7b62e9cafa97fdb72aa96b3a536c8e7975187157ecef2d9b7afb57643f3db"} Apr 23 13:31:58.842812 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.842536 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kz76p" podStartSLOduration=4.177430107 podStartE2EDuration="21.84252022s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.873545156 +0000 UTC m=+1.860786013" lastFinishedPulling="2026-04-23 13:31:56.538635267 +0000 UTC m=+19.525876126" observedRunningTime="2026-04-23 13:31:57.923777802 +0000 UTC m=+20.911018680" watchObservedRunningTime="2026-04-23 13:31:58.84252022 +0000 UTC m=+21.829761103" Apr 23 13:31:58.859392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:58.859312 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rgtnc" podStartSLOduration=4.260718894 podStartE2EDuration="21.859296727s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.940954796 +0000 UTC m=+1.928195653" lastFinishedPulling="2026-04-23 13:31:56.539532625 +0000 UTC m=+19.526773486" observedRunningTime="2026-04-23 13:31:58.842440723 +0000 UTC m=+21.829681604" watchObservedRunningTime="2026-04-23 13:31:58.859296727 +0000 UTC m=+21.846537618" Apr 23 13:31:59.432998 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.432965 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:59.433507 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.433491 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:59.666430 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.666399 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:31:59.666654 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:31:59.666522 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:31:59.839462 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.839419 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" event={"ID":"ceb05455-1014-4f0a-bf0c-498593be94d6","Type":"ContainerStarted","Data":"239cfb84ddc8d5612ff6e852c43d076b1e516dc2bb3688b92ee4fa4c56b35c0e"} Apr 23 13:31:59.841296 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.841268 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" event={"ID":"e6662ba3d19b8f35202b9c72562560f0","Type":"ContainerStarted","Data":"25bb1ed455993b6578d0f8c44bd87105bbdae572a09c4df69212c35fd9935a14"} Apr 23 13:31:59.841691 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.841671 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:59.842174 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.842158 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r2klv" Apr 23 13:31:59.859000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.858945 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5mz6" podStartSLOduration=2.354446641 podStartE2EDuration="22.85893217s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.845949297 +0000 UTC m=+1.833190153" lastFinishedPulling="2026-04-23 13:31:59.35043482 +0000 UTC m=+22.337675682" observedRunningTime="2026-04-23 13:31:59.858696593 +0000 UTC m=+22.845937482" watchObservedRunningTime="2026-04-23 13:31:59.85893217 +0000 UTC m=+22.846173039" Apr 23 13:31:59.872691 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:31:59.872646 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-40.ec2.internal" podStartSLOduration=21.872630227 podStartE2EDuration="21.872630227s" podCreationTimestamp="2026-04-23 13:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:59.872396227 +0000 UTC m=+22.859637105" watchObservedRunningTime="2026-04-23 13:31:59.872630227 +0000 UTC m=+22.859871087" Apr 23 13:32:00.665869 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:00.665840 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:00.665869 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:00.665842 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:00.666127 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:00.666018 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:00.666127 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:00.666119 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:00.846008 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:00.845933 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:32:00.846468 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:00.846387 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"da926ed58af9655e28a6a0d7b6d022ab975685c9987563b935ca6d4e36b62004"} Apr 23 13:32:01.665633 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:01.665584 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:01.665826 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:01.665728 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:32:02.665938 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:02.665883 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:02.666426 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:02.665883 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:02.666426 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:02.666022 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:02.666426 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:02.666107 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:02.820940 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:02.820880 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:02.821135 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:02.821057 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:02.821182 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:02.821137 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret podName:97134a37-e40a-4587-b02d-795b8a714cc0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:10.821115004 +0000 UTC m=+33.808355864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret") pod "global-pull-secret-syncer-msdj4" (UID: "97134a37-e40a-4587-b02d-795b8a714cc0") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:03.666015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.665801 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:03.666623 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:03.666096 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:32:03.854983 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.854955 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:32:03.855298 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.855274 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"95ea370f6b0cb3914976678580fb93a966a10e0f4c8ed75261afd6b9afb154ee"} Apr 23 13:32:03.855563 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.855541 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:32:03.855563 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.855573 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:32:03.855757 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.855741 2582 scope.go:117] "RemoveContainer" containerID="9ac325cccf516995bc36baaa2d3e592bb55a202195de6a77e0d2901c40339f04" Apr 23 13:32:03.858658 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.858632 2582 generic.go:358] "Generic (PLEG): container finished" podID="431c9349-7f7f-4d46-8b03-2517188be63c" containerID="e2d18476bb269a06024acad5b3a5ac18a3425be6193bc99734bf6ffbf1507b6d" exitCode=0 Apr 23 13:32:03.858734 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.858684 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerDied","Data":"e2d18476bb269a06024acad5b3a5ac18a3425be6193bc99734bf6ffbf1507b6d"} Apr 23 13:32:03.872083 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:03.872055 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:32:04.666248 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.666217 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:04.667311 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.666277 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:04.667311 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:04.666400 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:04.667311 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:04.666553 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:04.864057 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.863976 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:32:04.864349 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.864327 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" event={"ID":"30d85b9d-16ae-419a-8534-8b142607909e","Type":"ContainerStarted","Data":"f91abe1fef3047c25a20f72db2c8f8ac0a542e450a4d234bcebe656eab60a375"} Apr 23 13:32:04.864654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.864636 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:32:04.866329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.866307 2582 generic.go:358] "Generic (PLEG): container finished" podID="431c9349-7f7f-4d46-8b03-2517188be63c" containerID="89640e7b3556f402ec91c2995900f4ff080057c366bb4bdd6d22148b316b134b" exitCode=0 Apr 23 13:32:04.866409 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.866349 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerDied","Data":"89640e7b3556f402ec91c2995900f4ff080057c366bb4bdd6d22148b316b134b"} Apr 23 13:32:04.895972 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.895942 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:32:04.896329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.896290 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" podStartSLOduration=8.840633325 podStartE2EDuration="27.896277009s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.900947109 +0000 UTC m=+1.888187969" lastFinishedPulling="2026-04-23 13:31:57.956590797 +0000 UTC m=+20.943831653" observedRunningTime="2026-04-23 13:32:04.894964549 +0000 UTC m=+27.882205428" watchObservedRunningTime="2026-04-23 13:32:04.896277009 +0000 UTC m=+27.883517887" Apr 23 13:32:04.982900 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.982861 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8vwqm"] Apr 23 13:32:04.983105 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.983009 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:04.983165 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:04.983117 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:04.986090 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.986059 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-msdj4"] Apr 23 13:32:04.986221 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.986171 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:04.986293 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:04.986269 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:04.986568 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.986551 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dnqkh"] Apr 23 13:32:04.986661 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:04.986644 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:04.986775 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:04.986729 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:32:06.666294 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:06.666107 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:06.666760 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:06.666107 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:06.666760 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:06.666395 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:06.666760 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:06.666123 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:06.666760 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:06.666469 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:32:06.666760 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:06.666521 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:06.874441 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:06.874405 2582 generic.go:358] "Generic (PLEG): container finished" podID="431c9349-7f7f-4d46-8b03-2517188be63c" containerID="bb9b71ca5f66a6357bac17e9f48b36cb95acef418bce57ffc4b6cea00a16ba13" exitCode=0 Apr 23 13:32:06.874621 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:06.874527 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerDied","Data":"bb9b71ca5f66a6357bac17e9f48b36cb95acef418bce57ffc4b6cea00a16ba13"} Apr 23 13:32:08.666195 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:08.665864 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:08.666849 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:08.666363 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:08.666849 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:08.666695 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:08.666849 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:08.666734 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:08.667009 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:08.666852 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:32:08.667070 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:08.667038 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:10.273869 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:10.273832 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:10.274274 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.274033 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:10.274274 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.274109 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs podName:bf879d65-39bb-4d9a-aa57-7d499026e167 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.27409057 +0000 UTC m=+65.261331427 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs") pod "network-metrics-daemon-8vwqm" (UID: "bf879d65-39bb-4d9a-aa57-7d499026e167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:10.374499 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:10.374460 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:10.374696 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.374667 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:10.374696 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.374698 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:10.374861 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.374713 2582 projected.go:194] Error preparing data for projected volume kube-api-access-7n7sf for pod openshift-network-diagnostics/network-check-target-dnqkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:10.374861 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.374784 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf podName:70f0fbee-2214-4d11-8550-54879ecb58b1 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.374764795 +0000 UTC m=+65.362005675 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7n7sf" (UniqueName: "kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf") pod "network-check-target-dnqkh" (UID: "70f0fbee-2214-4d11-8550-54879ecb58b1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:10.665401 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:10.665368 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:10.665608 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:10.665367 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:10.665608 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.665499 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-msdj4" podUID="97134a37-e40a-4587-b02d-795b8a714cc0" Apr 23 13:32:10.665608 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:10.665367 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:10.665608 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.665567 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dnqkh" podUID="70f0fbee-2214-4d11-8550-54879ecb58b1" Apr 23 13:32:10.665772 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.665672 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vwqm" podUID="bf879d65-39bb-4d9a-aa57-7d499026e167" Apr 23 13:32:10.879470 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:10.879419 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:10.879679 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.879588 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:10.879679 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:10.879664 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret podName:97134a37-e40a-4587-b02d-795b8a714cc0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:26.87964285 +0000 UTC m=+49.866883707 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret") pod "global-pull-secret-syncer-msdj4" (UID: "97134a37-e40a-4587-b02d-795b8a714cc0") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:11.753309 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.753278 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-40.ec2.internal" event="NodeReady" Apr 23 13:32:11.753895 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.753419 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:11.789395 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.789361 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8664d9c4bc-prnck"] Apr 23 13:32:11.823910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.823879 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8jlgh"] Apr 23 13:32:11.824191 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.824171 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.826783 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.826756 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:32:11.826947 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.826790 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:32:11.826947 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.826802 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cq79g\"" Apr 23 13:32:11.827078 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.826711 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:32:11.834651 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.834622 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:32:11.843376 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.843353 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qx8mp"] Apr 23 13:32:11.843545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.843521 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:11.845742 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.845722 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4t8gv\"" Apr 23 13:32:11.845742 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.845743 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:11.845906 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.845784 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:11.861784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.861758 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8664d9c4bc-prnck"] Apr 23 13:32:11.861784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.861787 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8jlgh"] Apr 23 13:32:11.861913 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.861799 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qx8mp"] Apr 23 13:32:11.861913 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.861885 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:11.864317 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.864294 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:11.864317 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.864310 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:11.865445 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.864571 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4c4k9\"" Apr 23 13:32:11.865445 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.864701 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:11.988169 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988131 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-certificates\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988177 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-trusted-ca\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988208 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsv7\" (UniqueName: \"kubernetes.io/projected/3ad71ca4-3e9d-4f24-866e-d3a822733344-kube-api-access-flsv7\") pod \"dns-default-8jlgh\" (UID: 
\"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:11.988329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988228 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-bound-sa-token\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988296 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:11.988495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988338 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-image-registry-private-configuration\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988376 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w25t\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-kube-api-access-2w25t\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988439 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988479 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:11.988679 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988506 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-installation-pull-secrets\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988679 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988563 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ad71ca4-3e9d-4f24-866e-d3a822733344-tmp-dir\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " 
pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:11.988679 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988607 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afe0d7fe-f3d4-4096-acee-7dd59ac92395-ca-trust-extracted\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:11.988821 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988699 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ad71ca4-3e9d-4f24-866e-d3a822733344-config-volume\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:11.988821 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:11.988731 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkgs7\" (UniqueName: \"kubernetes.io/projected/71016223-1429-40b3-94b8-0fa57c7f235a-kube-api-access-nkgs7\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:12.089813 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.089739 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ad71ca4-3e9d-4f24-866e-d3a822733344-config-volume\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.089813 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.089790 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkgs7\" (UniqueName: \"kubernetes.io/projected/71016223-1429-40b3-94b8-0fa57c7f235a-kube-api-access-nkgs7\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:12.090045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.089828 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-certificates\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.089854 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-trusted-ca\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.089911 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flsv7\" (UniqueName: \"kubernetes.io/projected/3ad71ca4-3e9d-4f24-866e-d3a822733344-kube-api-access-flsv7\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.090045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.089971 2582 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-bound-sa-token\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090010 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.090045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090034 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-image-registry-private-configuration\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090065 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w25t\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-kube-api-access-2w25t\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090096 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090129 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090151 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-installation-pull-secrets\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090180 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ad71ca4-3e9d-4f24-866e-d3a822733344-tmp-dir\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090201 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afe0d7fe-f3d4-4096-acee-7dd59ac92395-ca-trust-extracted\") pod \"image-registry-8664d9c4bc-prnck\" (UID: 
\"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090338 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ad71ca4-3e9d-4f24-866e-d3a822733344-config-volume\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.090726 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.090477 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:12.090726 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.090534 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls podName:3ad71ca4-3e9d-4f24-866e-d3a822733344 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.590515345 +0000 UTC m=+35.577756201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls") pod "dns-default-8jlgh" (UID: "3ad71ca4-3e9d-4f24-866e-d3a822733344") : secret "dns-default-metrics-tls" not found Apr 23 13:32:12.090726 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090542 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-certificates\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090726 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.090596 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afe0d7fe-f3d4-4096-acee-7dd59ac92395-ca-trust-extracted\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.090726 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.090686 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:12.090726 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.090706 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8664d9c4bc-prnck: secret "image-registry-tls" not found Apr 23 13:32:12.091059 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.090762 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls podName:afe0d7fe-f3d4-4096-acee-7dd59ac92395 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.590737553 +0000 UTC m=+35.577978410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls") pod "image-registry-8664d9c4bc-prnck" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395") : secret "image-registry-tls" not found Apr 23 13:32:12.091059 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.091008 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:12.091059 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.091038 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ad71ca4-3e9d-4f24-866e-d3a822733344-tmp-dir\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.091059 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.091045 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert podName:71016223-1429-40b3-94b8-0fa57c7f235a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:12.591034191 +0000 UTC m=+35.578275047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert") pod "ingress-canary-qx8mp" (UID: "71016223-1429-40b3-94b8-0fa57c7f235a") : secret "canary-serving-cert" not found Apr 23 13:32:12.091393 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.091370 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-trusted-ca\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.095119 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.095098 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-installation-pull-secrets\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.095205 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.095177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-image-registry-private-configuration\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.103418 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.103395 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-bound-sa-token\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.103512 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.103429 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsv7\" (UniqueName: \"kubernetes.io/projected/3ad71ca4-3e9d-4f24-866e-d3a822733344-kube-api-access-flsv7\") pod \"dns-default-8jlgh\" (UID: 
\"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.103572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.103557 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkgs7\" (UniqueName: \"kubernetes.io/projected/71016223-1429-40b3-94b8-0fa57c7f235a-kube-api-access-nkgs7\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:12.105141 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.105118 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w25t\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-kube-api-access-2w25t\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.594928 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.594880 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:12.595132 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.595036 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:12.595132 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595069 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:12.595132 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.595074 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:12.595275 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595144 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert podName:71016223-1429-40b3-94b8-0fa57c7f235a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:13.595122496 +0000 UTC m=+36.582363356 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert") pod "ingress-canary-qx8mp" (UID: "71016223-1429-40b3-94b8-0fa57c7f235a") : secret "canary-serving-cert" not found Apr 23 13:32:12.595275 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595158 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:12.595275 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595168 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:12.595275 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595173 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8664d9c4bc-prnck: secret "image-registry-tls" not found Apr 23 13:32:12.595275 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595223 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls podName:3ad71ca4-3e9d-4f24-866e-d3a822733344 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:13.595205832 +0000 UTC m=+36.582446692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls") pod "dns-default-8jlgh" (UID: "3ad71ca4-3e9d-4f24-866e-d3a822733344") : secret "dns-default-metrics-tls" not found Apr 23 13:32:12.595275 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:12.595238 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls podName:afe0d7fe-f3d4-4096-acee-7dd59ac92395 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:13.595230741 +0000 UTC m=+36.582471604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls") pod "image-registry-8664d9c4bc-prnck" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395") : secret "image-registry-tls" not found Apr 23 13:32:12.665577 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.665543 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:12.665748 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.665543 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:12.665979 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.665952 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:12.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.669568 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfgql\"" Apr 23 13:32:12.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.669589 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:12.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.669623 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6f6d\"" Apr 23 13:32:12.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.669568 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:32:12.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.669652 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:12.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:12.669569 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:13.603156 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:13.603121 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:13.603156 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:13.603162 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:13.603187 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603277 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603278 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603325 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert podName:71016223-1429-40b3-94b8-0fa57c7f235a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.603311369 +0000 UTC m=+38.590552225 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert") pod "ingress-canary-qx8mp" (UID: "71016223-1429-40b3-94b8-0fa57c7f235a") : secret "canary-serving-cert" not found Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603347 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls podName:3ad71ca4-3e9d-4f24-866e-d3a822733344 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.60333385 +0000 UTC m=+38.590574707 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls") pod "dns-default-8jlgh" (UID: "3ad71ca4-3e9d-4f24-866e-d3a822733344") : secret "dns-default-metrics-tls" not found Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603285 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603361 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8664d9c4bc-prnck: secret "image-registry-tls" not found Apr 23 13:32:13.603692 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:13.603381 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls podName:afe0d7fe-f3d4-4096-acee-7dd59ac92395 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.603374832 +0000 UTC m=+38.590615688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls") pod "image-registry-8664d9c4bc-prnck" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395") : secret "image-registry-tls" not found Apr 23 13:32:13.892461 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:13.892280 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerStarted","Data":"74de1d248a1731855577fc3a5034594247c6ebe7dc6bb1f498a5f06e8758cd5f"} Apr 23 13:32:14.333404 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.333376 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9tk4c_25990e8e-c5b7-435c-8980-d1c4bd84116d/dns-node-resolver/0.log" Apr 23 13:32:14.754096 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.754054 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9lh7p"] Apr 23 13:32:14.778131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.778102 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9lh7p"] Apr 23 13:32:14.778271 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.778219 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:14.781177 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.781149 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 13:32:14.782131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.782109 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 13:32:14.782131 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.782122 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 13:32:14.782303 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.782138 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 13:32:14.782303 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.782156 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-4hjqk\"" Apr 23 13:32:14.896635 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.896599 2582 generic.go:358] "Generic (PLEG): container finished" podID="431c9349-7f7f-4d46-8b03-2517188be63c" containerID="74de1d248a1731855577fc3a5034594247c6ebe7dc6bb1f498a5f06e8758cd5f" exitCode=0 Apr 23 13:32:14.896812 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.896667 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerDied","Data":"74de1d248a1731855577fc3a5034594247c6ebe7dc6bb1f498a5f06e8758cd5f"} Apr 23 13:32:14.914051 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.914021 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d85158b2-b436-451e-a64a-1e9709ac315a-signing-key\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:14.914193 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.914082 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nng\" (UniqueName: \"kubernetes.io/projected/d85158b2-b436-451e-a64a-1e9709ac315a-kube-api-access-l5nng\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:14.914193 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:14.914143 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d85158b2-b436-451e-a64a-1e9709ac315a-signing-cabundle\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.015044 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.015016 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d85158b2-b436-451e-a64a-1e9709ac315a-signing-key\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.015256 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.015212 
2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nng\" (UniqueName: \"kubernetes.io/projected/d85158b2-b436-451e-a64a-1e9709ac315a-kube-api-access-l5nng\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.015344 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.015305 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d85158b2-b436-451e-a64a-1e9709ac315a-signing-cabundle\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.015872 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.015848 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d85158b2-b436-451e-a64a-1e9709ac315a-signing-cabundle\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.018336 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.018313 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d85158b2-b436-451e-a64a-1e9709ac315a-signing-key\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.032812 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.032792 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nng\" (UniqueName: \"kubernetes.io/projected/d85158b2-b436-451e-a64a-1e9709ac315a-kube-api-access-l5nng\") pod \"service-ca-865cb79987-9lh7p\" (UID: \"d85158b2-b436-451e-a64a-1e9709ac315a\") " pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.086458 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.086433 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9lh7p" Apr 23 13:32:15.244348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.244318 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9lh7p"] Apr 23 13:32:15.248303 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:15.248273 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85158b2_b436_451e_a64a_1e9709ac315a.slice/crio-996f875ce57c843a04a263596cdfdd74fe3907394b5ee1569d589847811e0b40 WatchSource:0}: Error finding container 996f875ce57c843a04a263596cdfdd74fe3907394b5ee1569d589847811e0b40: Status 404 returned error can't find the container with id 996f875ce57c843a04a263596cdfdd74fe3907394b5ee1569d589847811e0b40 Apr 23 13:32:15.334105 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.334033 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kz76p_b96ebd94-9b73-4821-8946-4734e772932d/node-ca/0.log" Apr 23 13:32:15.621302 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.621209 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:15.621302 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.621258 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:15.621302 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.621294 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621380 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621391 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621424 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8664d9c4bc-prnck: secret "image-registry-tls" not found Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621453 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls podName:3ad71ca4-3e9d-4f24-866e-d3a822733344 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:19.621438327 +0000 UTC m=+42.608679203 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls") pod "dns-default-8jlgh" (UID: "3ad71ca4-3e9d-4f24-866e-d3a822733344") : secret "dns-default-metrics-tls" not found Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621477 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls podName:afe0d7fe-f3d4-4096-acee-7dd59ac92395 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:19.621461913 +0000 UTC m=+42.608702770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls") pod "image-registry-8664d9c4bc-prnck" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395") : secret "image-registry-tls" not found Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621400 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:15.621527 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:15.621504 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert podName:71016223-1429-40b3-94b8-0fa57c7f235a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:19.621496319 +0000 UTC m=+42.608737177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert") pod "ingress-canary-qx8mp" (UID: "71016223-1429-40b3-94b8-0fa57c7f235a") : secret "canary-serving-cert" not found Apr 23 13:32:15.899750 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.899709 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9lh7p" event={"ID":"d85158b2-b436-451e-a64a-1e9709ac315a","Type":"ContainerStarted","Data":"996f875ce57c843a04a263596cdfdd74fe3907394b5ee1569d589847811e0b40"} Apr 23 13:32:15.902590 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.902560 2582 generic.go:358] "Generic (PLEG): container finished" podID="431c9349-7f7f-4d46-8b03-2517188be63c" containerID="3673eef23ff4d8d18d134b24590cddf9e160edd50a2a19bf89e1b617e440ae9a" exitCode=0 Apr 23 13:32:15.902725 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:15.902610 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerDied","Data":"3673eef23ff4d8d18d134b24590cddf9e160edd50a2a19bf89e1b617e440ae9a"} Apr 23 13:32:16.909109 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:16.909069 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" event={"ID":"431c9349-7f7f-4d46-8b03-2517188be63c","Type":"ContainerStarted","Data":"4ddef34df0e6699e62835304f031f3ab9d4396d0e346482785b6b63744dd740e"} Apr 23 13:32:16.935257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:16.935186 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vdtkd" podStartSLOduration=5.117249266 podStartE2EDuration="39.935163829s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.932642601 +0000 UTC m=+1.919883471" lastFinishedPulling="2026-04-23 13:32:13.750557177 +0000 UTC m=+36.737798034" 
observedRunningTime="2026-04-23 13:32:16.93511104 +0000 UTC m=+39.922351927" watchObservedRunningTime="2026-04-23 13:32:16.935163829 +0000 UTC m=+39.922404709" Apr 23 13:32:17.912429 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:17.912386 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9lh7p" event={"ID":"d85158b2-b436-451e-a64a-1e9709ac315a","Type":"ContainerStarted","Data":"0e60285813f452908dd43fe46de4a06912dbbd84d14ff1ff5d74650564fb5d16"} Apr 23 13:32:19.658500 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:19.658454 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:19.658500 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:19.658502 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:19.658549 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658614 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658641 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8664d9c4bc-prnck: secret "image-registry-tls" not found Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658675 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658685 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658702 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls podName:afe0d7fe-f3d4-4096-acee-7dd59ac92395 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:27.658685058 +0000 UTC m=+50.645925936 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls") pod "image-registry-8664d9c4bc-prnck" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395") : secret "image-registry-tls" not found Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658724 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls podName:3ad71ca4-3e9d-4f24-866e-d3a822733344 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:27.658709329 +0000 UTC m=+50.645950187 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls") pod "dns-default-8jlgh" (UID: "3ad71ca4-3e9d-4f24-866e-d3a822733344") : secret "dns-default-metrics-tls" not found Apr 23 13:32:19.658942 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:19.658741 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert podName:71016223-1429-40b3-94b8-0fa57c7f235a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:27.658732468 +0000 UTC m=+50.645973325 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert") pod "ingress-canary-qx8mp" (UID: "71016223-1429-40b3-94b8-0fa57c7f235a") : secret "canary-serving-cert" not found Apr 23 13:32:26.916244 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:26.916194 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:26.919109 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:26.919074 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97134a37-e40a-4587-b02d-795b8a714cc0-original-pull-secret\") pod \"global-pull-secret-syncer-msdj4\" (UID: \"97134a37-e40a-4587-b02d-795b8a714cc0\") " pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:27.077779 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.077744 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-msdj4" Apr 23 13:32:27.220856 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.220762 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-9lh7p" podStartSLOduration=11.083482615 podStartE2EDuration="13.220744352s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.250118671 +0000 UTC m=+38.237359528" lastFinishedPulling="2026-04-23 13:32:17.387380395 +0000 UTC m=+40.374621265" observedRunningTime="2026-04-23 13:32:17.936304964 +0000 UTC m=+40.923545842" watchObservedRunningTime="2026-04-23 13:32:27.220744352 +0000 UTC m=+50.207985289" Apr 23 13:32:27.221346 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.221326 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-msdj4"] Apr 23 13:32:27.224985 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:27.224953 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97134a37_e40a_4587_b02d_795b8a714cc0.slice/crio-5b6d4cef14d07d9e564433ab16c05f6b2eeaede5a9e2cab28a37ceeaa23f333c WatchSource:0}: Error finding container 5b6d4cef14d07d9e564433ab16c05f6b2eeaede5a9e2cab28a37ceeaa23f333c: Status 404 returned error can't find the container with id 5b6d4cef14d07d9e564433ab16c05f6b2eeaede5a9e2cab28a37ceeaa23f333c Apr 23 13:32:27.722488 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.722457 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:27.722676 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.722506 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:27.722676 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.722605 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:27.725443 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.725412 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ad71ca4-3e9d-4f24-866e-d3a822733344-metrics-tls\") pod \"dns-default-8jlgh\" (UID: \"3ad71ca4-3e9d-4f24-866e-d3a822733344\") " pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:27.725654 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.725629 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"image-registry-8664d9c4bc-prnck\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:27.728833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.728802 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71016223-1429-40b3-94b8-0fa57c7f235a-cert\") pod \"ingress-canary-qx8mp\" (UID: \"71016223-1429-40b3-94b8-0fa57c7f235a\") " pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:27.737569 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.737546 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:27.754392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.754363 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:27.772602 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.772544 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qx8mp" Apr 23 13:32:27.898322 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.898252 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8664d9c4bc-prnck"] Apr 23 13:32:27.903864 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:27.903815 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe0d7fe_f3d4_4096_acee_7dd59ac92395.slice/crio-dd88015d29a3ff1a533f2e0b00fe913e1081dceea48abad658192b9b7d2c39cd WatchSource:0}: Error finding container dd88015d29a3ff1a533f2e0b00fe913e1081dceea48abad658192b9b7d2c39cd: Status 404 returned error can't find the container with id dd88015d29a3ff1a533f2e0b00fe913e1081dceea48abad658192b9b7d2c39cd Apr 23 13:32:27.923130 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.923094 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8jlgh"] Apr 23 13:32:27.927953 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:27.927896 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad71ca4_3e9d_4f24_866e_d3a822733344.slice/crio-88ed8e7f78e6931ccc96295c1f40e85d59b447e9bcffd57827a9dd96fe161cbb WatchSource:0}: Error finding container 88ed8e7f78e6931ccc96295c1f40e85d59b447e9bcffd57827a9dd96fe161cbb: Status 404 returned error can't find the container with id 88ed8e7f78e6931ccc96295c1f40e85d59b447e9bcffd57827a9dd96fe161cbb Apr 23 13:32:27.932848 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.932804 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jlgh" event={"ID":"3ad71ca4-3e9d-4f24-866e-d3a822733344","Type":"ContainerStarted","Data":"88ed8e7f78e6931ccc96295c1f40e85d59b447e9bcffd57827a9dd96fe161cbb"} Apr 23 13:32:27.936410 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.936383 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" event={"ID":"afe0d7fe-f3d4-4096-acee-7dd59ac92395","Type":"ContainerStarted","Data":"dd88015d29a3ff1a533f2e0b00fe913e1081dceea48abad658192b9b7d2c39cd"} Apr 23 13:32:27.938825 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.938788 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-msdj4" event={"ID":"97134a37-e40a-4587-b02d-795b8a714cc0","Type":"ContainerStarted","Data":"5b6d4cef14d07d9e564433ab16c05f6b2eeaede5a9e2cab28a37ceeaa23f333c"} Apr 23 13:32:27.944435 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:27.944390 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-qx8mp"] Apr 23 13:32:27.952971 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:27.952912 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71016223_1429_40b3_94b8_0fa57c7f235a.slice/crio-1de093974895c717de7fd008c5aea98edddf240c2d15a5ec6bf1099e33750e16 WatchSource:0}: Error finding container 1de093974895c717de7fd008c5aea98edddf240c2d15a5ec6bf1099e33750e16: Status 404 returned error can't find the container with id 1de093974895c717de7fd008c5aea98edddf240c2d15a5ec6bf1099e33750e16 Apr 23 13:32:28.946414 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:28.946355 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" event={"ID":"afe0d7fe-f3d4-4096-acee-7dd59ac92395","Type":"ContainerStarted","Data":"f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4"} Apr 23 13:32:28.946873 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:28.946479 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:28.948687 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:28.948660 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qx8mp" event={"ID":"71016223-1429-40b3-94b8-0fa57c7f235a","Type":"ContainerStarted","Data":"1de093974895c717de7fd008c5aea98edddf240c2d15a5ec6bf1099e33750e16"} Apr 23 13:32:28.969358 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:28.968645 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" podStartSLOduration=51.968625063 podStartE2EDuration="51.968625063s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:28.968313941 +0000 UTC m=+51.955554820" watchObservedRunningTime="2026-04-23 13:32:28.968625063 +0000 UTC m=+51.955865944" Apr 23 13:32:31.957739 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.957689 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qx8mp" event={"ID":"71016223-1429-40b3-94b8-0fa57c7f235a","Type":"ContainerStarted","Data":"4cfe5229d55f99f66cc841c9058971933158f26749d2a4cb9a68c41325da03c5"} Apr 23 13:32:31.959074 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.959043 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-msdj4" event={"ID":"97134a37-e40a-4587-b02d-795b8a714cc0","Type":"ContainerStarted","Data":"30d7a79032add0973e19e714f61ac945d8b90f7c191cc968ac68b84abf7b8881"} Apr 23 13:32:31.960511 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.960483 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jlgh" event={"ID":"3ad71ca4-3e9d-4f24-866e-d3a822733344","Type":"ContainerStarted","Data":"25ce3a517be6fba77ff0966c28be6b1e92ab63423f6bb9109d0e45e2c22b4255"} Apr 23 13:32:31.960511 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.960508 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jlgh" event={"ID":"3ad71ca4-3e9d-4f24-866e-d3a822733344","Type":"ContainerStarted","Data":"6b404aa201751e007a0c4030ceeede30357a147377dda27137c85746cfa12151"} Apr 23 13:32:31.960648 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.960597 2582 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:31.973057 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.973012 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qx8mp" podStartSLOduration=17.392479145 podStartE2EDuration="20.972998422s" podCreationTimestamp="2026-04-23 13:32:11 +0000 UTC" firstStartedPulling="2026-04-23 13:32:27.955748246 +0000 UTC m=+50.942989104" lastFinishedPulling="2026-04-23 13:32:31.536267509 +0000 UTC m=+54.523508381" observedRunningTime="2026-04-23 13:32:31.972660436 +0000 UTC m=+54.959901314" watchObservedRunningTime="2026-04-23 13:32:31.972998422 +0000 UTC m=+54.960239301" Apr 23 13:32:31.989605 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:31.989548 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8jlgh" podStartSLOduration=17.386590167 podStartE2EDuration="20.989532047s" podCreationTimestamp="2026-04-23 13:32:11 +0000 UTC" firstStartedPulling="2026-04-23 13:32:27.93058774 +0000 UTC m=+50.917828599" lastFinishedPulling="2026-04-23 13:32:31.533529618 +0000 UTC m=+54.520770479" observedRunningTime="2026-04-23 13:32:31.988910359 +0000 UTC m=+54.976151237" watchObservedRunningTime="2026-04-23 13:32:31.989532047 +0000 UTC m=+54.976772973" Apr 23 13:32:32.004737 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:32.004682 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-msdj4" podStartSLOduration=32.68066393 podStartE2EDuration="37.004667389s" podCreationTimestamp="2026-04-23 13:31:55 +0000 UTC" firstStartedPulling="2026-04-23 13:32:27.226729863 +0000 UTC m=+50.213970720" lastFinishedPulling="2026-04-23 13:32:31.550733309 +0000 UTC m=+54.537974179" observedRunningTime="2026-04-23 13:32:32.004125983 +0000 UTC m=+54.991366864" watchObservedRunningTime="2026-04-23 13:32:32.004667389 +0000 UTC m=+54.991908265" Apr 23 13:32:36.893576 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:36.893543 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cx2lr" Apr 23 13:32:38.673677 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.673644 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-dsbgl"] Apr 23 13:32:38.678114 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.678087 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:38.681336 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.681309 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 13:32:38.681458 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.681309 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 13:32:38.681458 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.681315 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ldqnq\"" Apr 23 13:32:38.695154 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.695125 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dsbgl"] Apr 23 13:32:38.698556 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.698525 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clnw8\" (UniqueName: \"kubernetes.io/projected/bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4-kube-api-access-clnw8\") pod \"downloads-6bcc868b7-dsbgl\" (UID: \"bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4\") " pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:38.786059 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.786028 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zp2vr"] Apr 23 13:32:38.789002 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.788985 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.791989 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.791964 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:32:38.792159 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.792139 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:32:38.792313 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.792294 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:32:38.792539 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.792527 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-785x5\"" Apr 23 13:32:38.792584 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.792528 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:32:38.798937 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.798901 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.799080 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.798957 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-data-volume\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.799080 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.798990 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.799080 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.799050 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clnw8\" (UniqueName: \"kubernetes.io/projected/bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4-kube-api-access-clnw8\") pod \"downloads-6bcc868b7-dsbgl\" (UID: \"bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4\") " pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:38.799201 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.799081 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-crio-socket\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.799201 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.799122 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m4fh\" (UniqueName: \"kubernetes.io/projected/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-kube-api-access-7m4fh\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.801658 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.801638 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zp2vr"] Apr 23 13:32:38.811396 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.811363 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clnw8\" (UniqueName: \"kubernetes.io/projected/bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4-kube-api-access-clnw8\") pod \"downloads-6bcc868b7-dsbgl\" (UID: \"bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4\") " pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:38.900334 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900296 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900334 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900335 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-data-volume\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900572 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:32:38.900355 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900378 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-crio-socket\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900398 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m4fh\" (UniqueName: \"kubernetes.io/projected/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-kube-api-access-7m4fh\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900706 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900618 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-crio-socket\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900748 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900702 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-data-volume\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.900990 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.900970 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.902799 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.902783 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.913324 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.913283 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m4fh\" (UniqueName: \"kubernetes.io/projected/ec60ed1a-66b8-4ec4-ab1d-101ecce247a4-kube-api-access-7m4fh\") pod \"insights-runtime-extractor-zp2vr\" (UID: \"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4\") " pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:38.987699 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:38.987600 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:39.098513 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.098464 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zp2vr" Apr 23 13:32:39.111583 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.111551 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dsbgl"] Apr 23 13:32:39.116437 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:39.116410 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd3f2a9d_bf00_49e1_a88d_4a9fad71b3a4.slice/crio-42dba3fd608f971078ee9d2ecc54f1859b274f9bbb2fc8554874b595ff25deee WatchSource:0}: Error finding container 42dba3fd608f971078ee9d2ecc54f1859b274f9bbb2fc8554874b595ff25deee: Status 404 returned error can't find the container with id 42dba3fd608f971078ee9d2ecc54f1859b274f9bbb2fc8554874b595ff25deee Apr 23 13:32:39.226642 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.226611 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zp2vr"] Apr 23 13:32:39.229299 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:39.229260 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec60ed1a_66b8_4ec4_ab1d_101ecce247a4.slice/crio-ae0850ab4b6f74d52ebe7e372b4cacd224f3e31031fef8794aae5b850043ceea WatchSource:0}: Error finding container ae0850ab4b6f74d52ebe7e372b4cacd224f3e31031fef8794aae5b850043ceea: Status 404 returned error can't find the container with id ae0850ab4b6f74d52ebe7e372b4cacd224f3e31031fef8794aae5b850043ceea Apr 23 13:32:39.434639 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.434599 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6588fc4488-ggbnf"] Apr 23 13:32:39.437625 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.437602 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.440670 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.440642 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:32:39.440670 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.440646 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:32:39.441140 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.441121 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:32:39.441489 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.441472 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:32:39.444336 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.444076 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7ghqn\"" Apr 23 13:32:39.444336 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.444243 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:32:39.451858 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.451832 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6588fc4488-ggbnf"] Apr 23 13:32:39.505764 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.505670 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28tm\" (UniqueName: \"kubernetes.io/projected/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-kube-api-access-v28tm\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.505764 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.505715 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-oauth-config\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.505764 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.505732 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-config\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.505764 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.505755 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-service-ca\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.506040 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.505882 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-serving-cert\") pod \"console-6588fc4488-ggbnf\" 
(UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.506040 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.505936 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-oauth-serving-cert\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.606711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.606664 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-service-ca\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.606711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.606717 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-serving-cert\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.607015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.606738 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-oauth-serving-cert\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.607015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.606778 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v28tm\" (UniqueName: \"kubernetes.io/projected/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-kube-api-access-v28tm\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.607015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.606803 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-oauth-config\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.607015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.606819 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-config\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.607403 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.607377 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-service-ca\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.607542 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.607515 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-oauth-serving-cert\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.608015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.607995 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-config\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.609446 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.609425 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-oauth-config\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.609529 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.609508 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-serving-cert\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.615868 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.615846 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28tm\" (UniqueName: \"kubernetes.io/projected/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-kube-api-access-v28tm\") pod \"console-6588fc4488-ggbnf\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.737990 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.737949 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747"] Apr 23 13:32:39.741055 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.741034 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:39.743944 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.743905 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-6t9vr\"" Apr 23 13:32:39.744064 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.743976 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 13:32:39.747721 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.747343 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:39.757981 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.753645 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747"] Apr 23 13:32:39.808226 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.808119 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858931d3-ce64-4e42-adf5-a28423e3abd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zt747\" (UID: \"858931d3-ce64-4e42-adf5-a28423e3abd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:39.909716 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.909675 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858931d3-ce64-4e42-adf5-a28423e3abd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zt747\" (UID: \"858931d3-ce64-4e42-adf5-a28423e3abd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:39.909888 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:39.909853 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 23 13:32:39.909978 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:39.909944 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858931d3-ce64-4e42-adf5-a28423e3abd3-tls-certificates podName:858931d3-ce64-4e42-adf5-a28423e3abd3 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:40.409899001 +0000 UTC m=+63.397139872 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858931d3-ce64-4e42-adf5-a28423e3abd3-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-zt747" (UID: "858931d3-ce64-4e42-adf5-a28423e3abd3") : secret "prometheus-operator-admission-webhook-tls" not found Apr 23 13:32:39.935612 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.935553 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6588fc4488-ggbnf"] Apr 23 13:32:39.985104 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.985067 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zp2vr" event={"ID":"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4","Type":"ContainerStarted","Data":"2fde26576df05732d1fb03ca55f80a28a97a7f3b47398b79b72fbd2bacb4ce0e"} Apr 23 13:32:39.985263 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.985112 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zp2vr" event={"ID":"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4","Type":"ContainerStarted","Data":"ae0850ab4b6f74d52ebe7e372b4cacd224f3e31031fef8794aae5b850043ceea"} Apr 23 13:32:39.986314 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:39.986280 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dsbgl" event={"ID":"bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4","Type":"ContainerStarted","Data":"42dba3fd608f971078ee9d2ecc54f1859b274f9bbb2fc8554874b595ff25deee"} Apr 23 13:32:39.994857 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:39.994823 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3f8619b_4dbe_4fb4_8ef6_4f6d829574e4.slice/crio-05d94bcb5fc66fe3a175253c4617d6f8c00467abec5c486192b182e3e9be9a21 WatchSource:0}: Error finding container 05d94bcb5fc66fe3a175253c4617d6f8c00467abec5c486192b182e3e9be9a21: Status 404 returned error can't find the container with id 05d94bcb5fc66fe3a175253c4617d6f8c00467abec5c486192b182e3e9be9a21 Apr 23 13:32:40.414300 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.414262 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858931d3-ce64-4e42-adf5-a28423e3abd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zt747\" (UID: \"858931d3-ce64-4e42-adf5-a28423e3abd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:40.417291 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.417264 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858931d3-ce64-4e42-adf5-a28423e3abd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zt747\" (UID: \"858931d3-ce64-4e42-adf5-a28423e3abd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:40.653537 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.653466 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:40.826995 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.825541 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747"] Apr 23 13:32:40.831630 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:40.831594 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858931d3_ce64_4e42_adf5_a28423e3abd3.slice/crio-8791b687c88c5165d48b5376d2c81fa4745b0328c5cf1a90123995b792833ff2 WatchSource:0}: Error finding container 8791b687c88c5165d48b5376d2c81fa4745b0328c5cf1a90123995b792833ff2: Status 404 returned error can't find the container with id 8791b687c88c5165d48b5376d2c81fa4745b0328c5cf1a90123995b792833ff2 Apr 23 13:32:40.991233 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.991148 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zp2vr" event={"ID":"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4","Type":"ContainerStarted","Data":"75081b95daa515748d4f156d0dd50c6d92bb3b6952a6639f5afde728dcef671d"} Apr 23 13:32:40.992513 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.992486 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" event={"ID":"858931d3-ce64-4e42-adf5-a28423e3abd3","Type":"ContainerStarted","Data":"8791b687c88c5165d48b5376d2c81fa4745b0328c5cf1a90123995b792833ff2"} Apr 23 13:32:40.993849 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:40.993817 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6588fc4488-ggbnf" event={"ID":"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4","Type":"ContainerStarted","Data":"05d94bcb5fc66fe3a175253c4617d6f8c00467abec5c486192b182e3e9be9a21"} Apr 23 13:32:41.198693 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.197590 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6db8656c7d-n2krn"] Apr 23 13:32:41.202257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.202228 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.213106 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.212997 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db8656c7d-n2krn"] Apr 23 13:32:41.213572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.213546 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:32:41.222173 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222146 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbnr\" (UniqueName: \"kubernetes.io/projected/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-kube-api-access-6cbnr\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.222478 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222391 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-serving-cert\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.222559 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222507 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-service-ca\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.222618 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222582 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-oauth-config\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.222618 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222613 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-trusted-ca-bundle\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.222719 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222681 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-config\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.222770 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.222739 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-oauth-serving-cert\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.323876 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:32:41.323792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cbnr\" (UniqueName: \"kubernetes.io/projected/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-kube-api-access-6cbnr\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.323876 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.323862 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-serving-cert\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.323876 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.323880 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-service-ca\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.324173 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.323904 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-oauth-config\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.324173 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.323943 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-trusted-ca-bundle\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.324173 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.323965 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-config\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.324173 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.323991 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-oauth-serving-cert\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.324698 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.324641 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-oauth-serving-cert\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.325226 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.325198 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-trusted-ca-bundle\") pod \"console-6db8656c7d-n2krn\" (UID: 
\"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.325376 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.325200 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-config\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.325376 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.325248 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-service-ca\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.327830 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.327802 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-oauth-config\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.328891 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.328842 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-serving-cert\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.333844 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.333802 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbnr\" (UniqueName: \"kubernetes.io/projected/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-kube-api-access-6cbnr\") pod \"console-6db8656c7d-n2krn\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.516674 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.516631 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:41.967635 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:41.967602 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8jlgh" Apr 23 13:32:42.334422 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.334325 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:42.337139 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.337112 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:42.348991 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.348957 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf879d65-39bb-4d9a-aa57-7d499026e167-metrics-certs\") pod \"network-metrics-daemon-8vwqm\" (UID: \"bf879d65-39bb-4d9a-aa57-7d499026e167\") " pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:42.393801 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.393765 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfgql\"" Apr 23 13:32:42.402129 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.402092 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vwqm" Apr 23 13:32:42.435535 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.435496 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:42.438534 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.438283 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:42.448609 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.448579 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:42.460218 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.460151 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7sf\" (UniqueName: \"kubernetes.io/projected/70f0fbee-2214-4d11-8550-54879ecb58b1-kube-api-access-7n7sf\") pod \"network-check-target-dnqkh\" (UID: \"70f0fbee-2214-4d11-8550-54879ecb58b1\") " pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:42.687817 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.687765 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6f6d\"" Apr 23 13:32:42.696058 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:42.696021 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:43.901642 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:43.901552 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db8656c7d-n2krn"] Apr 23 13:32:43.924907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:43.924821 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dnqkh"] Apr 23 13:32:43.937884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:43.937844 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8vwqm"] Apr 23 13:32:44.005038 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:44.004940 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zp2vr" event={"ID":"ec60ed1a-66b8-4ec4-ab1d-101ecce247a4","Type":"ContainerStarted","Data":"389489ec241638fb0aaae955ed9206301d1ab06ab52986dc028c4af431135a9d"} Apr 23 13:32:44.006496 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:44.006471 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" event={"ID":"858931d3-ce64-4e42-adf5-a28423e3abd3","Type":"ContainerStarted","Data":"e847ee5cdc63f97d9b2c790016d9ecf073f50ae0681ecd1bdece84e48b888fd4"} Apr 23 13:32:44.006686 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:44.006665 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:44.013487 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:44.013462 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" Apr 23 13:32:44.030397 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:44.030190 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zp2vr" podStartSLOduration=1.602451217 podStartE2EDuration="6.030169869s" podCreationTimestamp="2026-04-23 13:32:38 +0000 UTC" firstStartedPulling="2026-04-23 13:32:39.290341345 +0000 UTC m=+62.277582204" lastFinishedPulling="2026-04-23 13:32:43.718059977 +0000 UTC m=+66.705300856" observedRunningTime="2026-04-23 13:32:44.029175999 +0000 UTC m=+67.016416879" watchObservedRunningTime="2026-04-23 13:32:44.030169869 +0000 UTC m=+67.017410750" Apr 23 13:32:44.048577 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:44.048523 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zt747" podStartSLOduration=2.16268053 podStartE2EDuration="5.048506095s" podCreationTimestamp="2026-04-23 13:32:39 +0000 UTC" firstStartedPulling="2026-04-23 13:32:40.833484533 +0000 UTC m=+63.820725407" lastFinishedPulling="2026-04-23 13:32:43.719310112 +0000 UTC m=+66.706550972" observedRunningTime="2026-04-23 13:32:44.048232295 +0000 UTC m=+67.035473176" watchObservedRunningTime="2026-04-23 13:32:44.048506095 +0000 UTC m=+67.035746975" Apr 23 13:32:44.077781 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:44.077740 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1bd409_5535_49e4_9d2b_fa05f2d2fb5c.slice/crio-b5979459912bb4cbfefc72a4bd449469391a309988dbf5691a56965fd7d99a20 WatchSource:0}: Error finding container 
b5979459912bb4cbfefc72a4bd449469391a309988dbf5691a56965fd7d99a20: Status 404 returned error can't find the container with id b5979459912bb4cbfefc72a4bd449469391a309988dbf5691a56965fd7d99a20 Apr 23 13:32:44.078455 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:44.078417 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f0fbee_2214_4d11_8550_54879ecb58b1.slice/crio-d434330164fe24fc7714336d1d70d94fcae3e8f80e095c588f4d97d8e1f36832 WatchSource:0}: Error finding container d434330164fe24fc7714336d1d70d94fcae3e8f80e095c588f4d97d8e1f36832: Status 404 returned error can't find the container with id d434330164fe24fc7714336d1d70d94fcae3e8f80e095c588f4d97d8e1f36832 Apr 23 13:32:44.079877 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:44.079851 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf879d65_39bb_4d9a_aa57_7d499026e167.slice/crio-39c9d63ecaf19dbdd61397afcf677fed3ded30c026ce228ce3d2899d4de15c1e WatchSource:0}: Error finding container 39c9d63ecaf19dbdd61397afcf677fed3ded30c026ce228ce3d2899d4de15c1e: Status 404 returned error can't find the container with id 39c9d63ecaf19dbdd61397afcf677fed3ded30c026ce228ce3d2899d4de15c1e Apr 23 13:32:45.011779 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.011720 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vwqm" event={"ID":"bf879d65-39bb-4d9a-aa57-7d499026e167","Type":"ContainerStarted","Data":"39c9d63ecaf19dbdd61397afcf677fed3ded30c026ce228ce3d2899d4de15c1e"} Apr 23 13:32:45.013344 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.013296 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dnqkh" event={"ID":"70f0fbee-2214-4d11-8550-54879ecb58b1","Type":"ContainerStarted","Data":"d434330164fe24fc7714336d1d70d94fcae3e8f80e095c588f4d97d8e1f36832"} Apr 23 13:32:45.016275 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.015625 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db8656c7d-n2krn" event={"ID":"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c","Type":"ContainerStarted","Data":"61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571"} Apr 23 13:32:45.016275 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.015652 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db8656c7d-n2krn" event={"ID":"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c","Type":"ContainerStarted","Data":"b5979459912bb4cbfefc72a4bd449469391a309988dbf5691a56965fd7d99a20"} Apr 23 13:32:45.018558 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.018506 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6588fc4488-ggbnf" event={"ID":"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4","Type":"ContainerStarted","Data":"30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14"} Apr 23 13:32:45.049982 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.049075 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6db8656c7d-n2krn" podStartSLOduration=4.049052939 podStartE2EDuration="4.049052939s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:45.04810746 +0000 UTC m=+68.035348343" watchObservedRunningTime="2026-04-23 
13:32:45.049052939 +0000 UTC m=+68.036293836" Apr 23 13:32:45.067698 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:45.065550 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6588fc4488-ggbnf" podStartSLOduration=1.950170134 podStartE2EDuration="6.065532332s" podCreationTimestamp="2026-04-23 13:32:39 +0000 UTC" firstStartedPulling="2026-04-23 13:32:39.996997277 +0000 UTC m=+62.984238133" lastFinishedPulling="2026-04-23 13:32:44.112359457 +0000 UTC m=+67.099600331" observedRunningTime="2026-04-23 13:32:45.065035987 +0000 UTC m=+68.052276867" watchObservedRunningTime="2026-04-23 13:32:45.065532332 +0000 UTC m=+68.052773212" Apr 23 13:32:47.027713 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:47.027674 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vwqm" event={"ID":"bf879d65-39bb-4d9a-aa57-7d499026e167","Type":"ContainerStarted","Data":"7c371deb780c7f5af2f4a6c629deb29557397ea0c39f7d7ad78323e41e4aa337"} Apr 23 13:32:47.027713 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:47.027721 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vwqm" event={"ID":"bf879d65-39bb-4d9a-aa57-7d499026e167","Type":"ContainerStarted","Data":"5facd6018396b3f68c30e8353bd02308924850329d486159a7a2548fdaf1fe41"} Apr 23 13:32:47.051150 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:47.050046 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8vwqm" podStartSLOduration=68.358543971 podStartE2EDuration="1m10.050025489s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:44.103343129 +0000 UTC m=+67.090583992" lastFinishedPulling="2026-04-23 13:32:45.794824648 +0000 UTC m=+68.782065510" observedRunningTime="2026-04-23 13:32:47.049216833 +0000 UTC m=+70.036457713" watchObservedRunningTime="2026-04-23 13:32:47.050025489 +0000 UTC m=+70.037266369" Apr 23 13:32:47.742688 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:47.742639 2582 patch_prober.go:28] interesting pod/image-registry-8664d9c4bc-prnck container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:32:47.742839 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:47.742707 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" podUID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:32:48.031945 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:48.031467 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dnqkh" event={"ID":"70f0fbee-2214-4d11-8550-54879ecb58b1","Type":"ContainerStarted","Data":"ca73e7b653d930a4b2a222055914b8b2ef74ed29b531a7bff5059df2fdbf43d5"} Apr 23 13:32:48.031945 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:48.031838 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:32:48.050053 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:48.049981 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-dnqkh" podStartSLOduration=67.590104999 podStartE2EDuration="1m11.049959297s" podCreationTimestamp="2026-04-23 13:31:37 +0000 UTC" firstStartedPulling="2026-04-23 13:32:44.103568691 +0000 UTC m=+67.090809551" lastFinishedPulling="2026-04-23 13:32:47.563422921 +0000 UTC m=+70.550663849" observedRunningTime="2026-04-23 13:32:48.0484945 +0000 UTC m=+71.035735383" watchObservedRunningTime="2026-04-23 13:32:48.049959297 +0000 UTC m=+71.037200177" Apr 23 13:32:49.748309 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:49.748255 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:49.748309 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:49.748306 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:49.753726 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:49.753699 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:49.956616 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:49.956585 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:32:50.042983 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:50.042893 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:32:51.518114 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:51.517964 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:51.518114 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:51.518011 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:51.523230 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:51.523203 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:52.047725 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.047697 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:32:52.114753 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.114717 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6588fc4488-ggbnf"] Apr 23 13:32:52.195169 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.193746 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-llhfg"] Apr 23 13:32:52.200553 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.199130 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.202941 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.202886 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:32:52.203119 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.203037 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-75lnf\"" Apr 23 13:32:52.203275 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.203250 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 13:32:52.203333 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.203293 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 13:32:52.203398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.203250 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:32:52.203556 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.203540 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:32:52.203808 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.203793 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 13:32:52.323636 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323536 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9476c852-09eb-4fdf-9fbc-50fc81e92780-metrics-client-ca\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323636 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323589 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323636 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323631 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-sys\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323673 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-textfile\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323703 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-accelerators-collector-config\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323733 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxpvk\" (UniqueName: \"kubernetes.io/projected/9476c852-09eb-4fdf-9fbc-50fc81e92780-kube-api-access-gxpvk\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323776 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-root\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323799 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-tls\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.323910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.323831 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-wtmp\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425000 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-root\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425048 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-tls\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425076 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-wtmp\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425101 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9476c852-09eb-4fdf-9fbc-50fc81e92780-metrics-client-ca\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:32:52.425124 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425163 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-sys\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425200 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-textfile\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425230 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-accelerators-collector-config\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425263 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxpvk\" (UniqueName: \"kubernetes.io/projected/9476c852-09eb-4fdf-9fbc-50fc81e92780-kube-api-access-gxpvk\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.425696 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-root\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:52.425799 2582 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:52.425863 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-tls podName:9476c852-09eb-4fdf-9fbc-50fc81e92780 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:52.925845403 +0000 UTC m=+75.913086264 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-tls") pod "node-exporter-llhfg" (UID: "9476c852-09eb-4fdf-9fbc-50fc81e92780") : secret "node-exporter-tls" not found Apr 23 13:32:52.426147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.426106 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-sys\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426852 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.426655 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-textfile\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.426852 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.426802 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-wtmp\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.427090 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.426949 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9476c852-09eb-4fdf-9fbc-50fc81e92780-metrics-client-ca\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.427356 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.427318 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-accelerators-collector-config\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.434277 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.434250 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.439797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.439758 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxpvk\" (UniqueName: \"kubernetes.io/projected/9476c852-09eb-4fdf-9fbc-50fc81e92780-kube-api-access-gxpvk\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:52.928257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.928215 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-tls\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 
13:32:52.931053 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:52.931021 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9476c852-09eb-4fdf-9fbc-50fc81e92780-node-exporter-tls\") pod \"node-exporter-llhfg\" (UID: \"9476c852-09eb-4fdf-9fbc-50fc81e92780\") " pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:53.115068 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:53.115028 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-llhfg" Apr 23 13:32:56.365300 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:56.365268 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9476c852_09eb_4fdf_9fbc_50fc81e92780.slice/crio-0ad35e7e0382b57a11120ca59a7eac936e5f08b121f6f265a1d730cf0d896703 WatchSource:0}: Error finding container 0ad35e7e0382b57a11120ca59a7eac936e5f08b121f6f265a1d730cf0d896703: Status 404 returned error can't find the container with id 0ad35e7e0382b57a11120ca59a7eac936e5f08b121f6f265a1d730cf0d896703 Apr 23 13:32:56.947357 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:56.946036 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v"] Apr 23 13:32:56.950811 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:56.950785 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:56.953651 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:56.953308 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-x78k9\"" Apr 23 13:32:56.953651 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:56.953517 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 13:32:56.963876 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:56.963838 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v"] Apr 23 13:32:57.060982 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.060909 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llhfg" event={"ID":"9476c852-09eb-4fdf-9fbc-50fc81e92780","Type":"ContainerStarted","Data":"0ad35e7e0382b57a11120ca59a7eac936e5f08b121f6f265a1d730cf0d896703"} Apr 23 13:32:57.063538 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.063503 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dsbgl" event={"ID":"bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4","Type":"ContainerStarted","Data":"9dc33d808c3991d76458e17db53957cd8345661a57bec03c1aa96d9d473003d9"} Apr 23 13:32:57.064417 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.064379 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:57.071441 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.071404 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nvt2v\" (UID: \"f483be1c-073c-402c-b195-f1bfc3325dea\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:57.078764 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.078735 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-dsbgl" Apr 23 13:32:57.092362 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.092304 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-dsbgl" podStartSLOduration=1.688726186 podStartE2EDuration="19.092285243s" podCreationTimestamp="2026-04-23 13:32:38 +0000 UTC" firstStartedPulling="2026-04-23 13:32:39.119001907 +0000 UTC m=+62.106242764" lastFinishedPulling="2026-04-23 13:32:56.522560963 +0000 UTC m=+79.509801821" observedRunningTime="2026-04-23 13:32:57.090037858 +0000 UTC m=+80.077278738" watchObservedRunningTime="2026-04-23 13:32:57.092285243 +0000 UTC m=+80.079526122" Apr 23 13:32:57.172200 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.172162 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nvt2v\" (UID: \"f483be1c-073c-402c-b195-f1bfc3325dea\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:57.172342 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:57.172306 2582 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 13:32:57.172440 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:57.172385 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert podName:f483be1c-073c-402c-b195-f1bfc3325dea nodeName:}" failed. No retries permitted until 2026-04-23 13:32:57.672361522 +0000 UTC m=+80.659602398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-nvt2v" (UID: "f483be1c-073c-402c-b195-f1bfc3325dea") : secret "monitoring-plugin-cert" not found Apr 23 13:32:57.676344 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:57.676305 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nvt2v\" (UID: \"f483be1c-073c-402c-b195-f1bfc3325dea\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:57.676786 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:57.676430 2582 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 13:32:57.676786 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:32:57.676495 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert podName:f483be1c-073c-402c-b195-f1bfc3325dea nodeName:}" failed. No retries permitted until 2026-04-23 13:32:58.676474901 +0000 UTC m=+81.663715774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-nvt2v" (UID: "f483be1c-073c-402c-b195-f1bfc3325dea") : secret "monitoring-plugin-cert" not found Apr 23 13:32:58.068073 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:58.067997 2582 generic.go:358] "Generic (PLEG): container finished" podID="9476c852-09eb-4fdf-9fbc-50fc81e92780" containerID="00e4cb7e16d91d38e687438fcd382746e98a00257981ac56b41f24b215f8779b" exitCode=0 Apr 23 13:32:58.068233 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:58.068086 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llhfg" event={"ID":"9476c852-09eb-4fdf-9fbc-50fc81e92780","Type":"ContainerDied","Data":"00e4cb7e16d91d38e687438fcd382746e98a00257981ac56b41f24b215f8779b"} Apr 23 13:32:58.685680 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:58.685637 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nvt2v\" (UID: \"f483be1c-073c-402c-b195-f1bfc3325dea\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:58.688701 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:58.688667 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f483be1c-073c-402c-b195-f1bfc3325dea-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nvt2v\" (UID: \"f483be1c-073c-402c-b195-f1bfc3325dea\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:58.772761 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:58.772715 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:32:58.919562 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:58.919529 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v"] Apr 23 13:32:58.922971 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:32:58.922940 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf483be1c_073c_402c_b195_f1bfc3325dea.slice/crio-4ccb92fdb22df82345b69e793c148a075d586423e1e1ac5b18af23e21c44d24c WatchSource:0}: Error finding container 4ccb92fdb22df82345b69e793c148a075d586423e1e1ac5b18af23e21c44d24c: Status 404 returned error can't find the container with id 4ccb92fdb22df82345b69e793c148a075d586423e1e1ac5b18af23e21c44d24c Apr 23 13:32:59.074818 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:59.074707 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" event={"ID":"f483be1c-073c-402c-b195-f1bfc3325dea","Type":"ContainerStarted","Data":"4ccb92fdb22df82345b69e793c148a075d586423e1e1ac5b18af23e21c44d24c"} Apr 23 13:32:59.078265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:59.077770 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llhfg" event={"ID":"9476c852-09eb-4fdf-9fbc-50fc81e92780","Type":"ContainerStarted","Data":"889c47b89367b3eb404503232d5fe57f59eff8a531c2cb186ef289e42bf7c924"} Apr 23 13:32:59.078265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:59.077813 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llhfg" event={"ID":"9476c852-09eb-4fdf-9fbc-50fc81e92780","Type":"ContainerStarted","Data":"dcfbbaa55c175b07ae880d73bd7b813e555635d54d6a81ad0623b281edc3defa"} Apr 23 13:32:59.101732 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:32:59.101664 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-llhfg" podStartSLOduration=6.334220909 podStartE2EDuration="7.101646502s" podCreationTimestamp="2026-04-23 13:32:52 +0000 UTC" firstStartedPulling="2026-04-23 13:32:56.367444887 +0000 UTC m=+79.354685744" lastFinishedPulling="2026-04-23 13:32:57.134870472 +0000 UTC m=+80.122111337" observedRunningTime="2026-04-23 13:32:59.100153307 +0000 UTC m=+82.087394187" watchObservedRunningTime="2026-04-23 13:32:59.101646502 +0000 UTC m=+82.088887382" Apr 23 13:33:01.087447 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:01.087404 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" event={"ID":"f483be1c-073c-402c-b195-f1bfc3325dea","Type":"ContainerStarted","Data":"7977401cdf9be79d39313750e158b617eb88509a4cb303fbf1a19cf01f403bd5"} Apr 23 13:33:01.088011 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:01.087653 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:33:01.092770 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:01.092742 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" Apr 23 13:33:01.103897 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:01.103851 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nvt2v" podStartSLOduration=3.407038542 
podStartE2EDuration="5.10383504s" podCreationTimestamp="2026-04-23 13:32:56 +0000 UTC" firstStartedPulling="2026-04-23 13:32:58.925097619 +0000 UTC m=+81.912338480" lastFinishedPulling="2026-04-23 13:33:00.621894102 +0000 UTC m=+83.609134978" observedRunningTime="2026-04-23 13:33:01.102634475 +0000 UTC m=+84.089875355" watchObservedRunningTime="2026-04-23 13:33:01.10383504 +0000 UTC m=+84.091075933" Apr 23 13:33:01.267297 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:01.267260 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8664d9c4bc-prnck"] Apr 23 13:33:08.211959 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:08.211906 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db8656c7d-n2krn"] Apr 23 13:33:17.144215 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.144152 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6588fc4488-ggbnf" podUID="c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" containerName="console" containerID="cri-o://30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14" gracePeriod=15 Apr 23 13:33:17.449720 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.449696 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6588fc4488-ggbnf_c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4/console/0.log" Apr 23 13:33:17.449837 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.449766 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:33:17.535809 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.535771 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-config\") pod \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " Apr 23 13:33:17.535809 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.535816 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-oauth-serving-cert\") pod \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " Apr 23 13:33:17.536080 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.535841 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v28tm\" (UniqueName: \"kubernetes.io/projected/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-kube-api-access-v28tm\") pod \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " Apr 23 13:33:17.536080 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.535996 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-serving-cert\") pod \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " Apr 23 13:33:17.536080 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.536040 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-oauth-config\") pod \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " Apr 23 13:33:17.536239 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.536081 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-service-ca\") pod \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\" (UID: \"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4\") " Apr 23 13:33:17.536291 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.536272 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-config" (OuterVolumeSpecName: "console-config") pod "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" (UID: "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:17.536336 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.536281 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" (UID: "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:17.536519 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.536495 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-service-ca" (OuterVolumeSpecName: "service-ca") pod "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" (UID: "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:17.538318 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.538291 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" (UID: "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:17.538412 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.538350 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" (UID: "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:17.538412 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.538383 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-kube-api-access-v28tm" (OuterVolumeSpecName: "kube-api-access-v28tm") pod "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" (UID: "c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4"). InnerVolumeSpecName "kube-api-access-v28tm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:17.637423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.637394 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v28tm\" (UniqueName: \"kubernetes.io/projected/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-kube-api-access-v28tm\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:17.637423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.637418 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:17.637423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.637428 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-oauth-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:17.637652 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.637437 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-service-ca\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:17.637652 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.637446 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-console-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:17.637652 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:17.637454 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4-oauth-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:18.136577 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.136551 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6588fc4488-ggbnf_c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4/console/0.log" Apr 23 13:33:18.136742 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.136593 2582 generic.go:358] "Generic (PLEG): container finished" podID="c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" containerID="30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14" exitCode=2 Apr 23 13:33:18.136742 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.136665 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6588fc4488-ggbnf" Apr 23 13:33:18.136817 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.136667 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6588fc4488-ggbnf" event={"ID":"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4","Type":"ContainerDied","Data":"30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14"} Apr 23 13:33:18.136817 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.136787 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6588fc4488-ggbnf" event={"ID":"c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4","Type":"ContainerDied","Data":"05d94bcb5fc66fe3a175253c4617d6f8c00467abec5c486192b182e3e9be9a21"} Apr 23 13:33:18.136817 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.136809 2582 scope.go:117] "RemoveContainer" containerID="30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14" Apr 23 13:33:18.144704 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.144684 2582 scope.go:117] "RemoveContainer" containerID="30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14" Apr 23 13:33:18.145019 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:33:18.144988 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14\": container with ID starting with 30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14 not found: ID does not exist" containerID="30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14" Apr 23 13:33:18.145081 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.145026 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14"} err="failed to get container status \"30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14\": rpc error: code = NotFound desc = could not find container \"30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14\": container with ID starting with 30cc7616b5aad7c52b598ae5ca01a3aeab68d114e9be4a2329afc8211a64dc14 not found: ID does not exist" Apr 23 13:33:18.152627 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.152603 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6588fc4488-ggbnf"] Apr 23 13:33:18.156423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:18.156403 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6588fc4488-ggbnf"] Apr 23 13:33:19.038340 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:19.038310 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dnqkh" Apr 23 13:33:19.669775 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:19.669728 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" path="/var/lib/kubelet/pods/c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4/volumes" Apr 23 13:33:26.289594 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.289551 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" podUID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" containerName="registry" containerID="cri-o://f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4" gracePeriod=30 Apr 23 13:33:26.544518 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:33:26.544443 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:33:26.708943 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.708888 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709143 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.708961 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afe0d7fe-f3d4-4096-acee-7dd59ac92395-ca-trust-extracted\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709143 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.708990 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-certificates\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709143 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709009 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-bound-sa-token\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709143 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709026 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w25t\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-kube-api-access-2w25t\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709143 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709049 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-installation-pull-secrets\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709143 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709081 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-image-registry-private-configuration\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709438 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709227 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-trusted-ca\") pod \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\" (UID: \"afe0d7fe-f3d4-4096-acee-7dd59ac92395\") " Apr 23 13:33:26.709705 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709564 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" 
(UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:26.709854 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.709780 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:26.712064 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.712034 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:26.712064 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.712042 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-kube-api-access-2w25t" (OuterVolumeSpecName: "kube-api-access-2w25t") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "kube-api-access-2w25t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:26.712204 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.712071 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:26.712204 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.712116 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:26.712204 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.712153 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:26.720227 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.720192 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe0d7fe-f3d4-4096-acee-7dd59ac92395-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "afe0d7fe-f3d4-4096-acee-7dd59ac92395" (UID: "afe0d7fe-f3d4-4096-acee-7dd59ac92395"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810457 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-installation-pull-secrets\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810492 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afe0d7fe-f3d4-4096-acee-7dd59ac92395-image-registry-private-configuration\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810505 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-trusted-ca\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810514 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810523 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afe0d7fe-f3d4-4096-acee-7dd59ac92395-ca-trust-extracted\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810532 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afe0d7fe-f3d4-4096-acee-7dd59ac92395-registry-certificates\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810541 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-bound-sa-token\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:26.810552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:26.810549 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w25t\" (UniqueName: \"kubernetes.io/projected/afe0d7fe-f3d4-4096-acee-7dd59ac92395-kube-api-access-2w25t\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:27.162512 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.162481 2582 generic.go:358] "Generic (PLEG): container finished" podID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" containerID="f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4" exitCode=0 Apr 23 13:33:27.162711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.162527 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" event={"ID":"afe0d7fe-f3d4-4096-acee-7dd59ac92395","Type":"ContainerDied","Data":"f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4"} Apr 23 13:33:27.162711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.162550 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" 
event={"ID":"afe0d7fe-f3d4-4096-acee-7dd59ac92395","Type":"ContainerDied","Data":"dd88015d29a3ff1a533f2e0b00fe913e1081dceea48abad658192b9b7d2c39cd"} Apr 23 13:33:27.162711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.162564 2582 scope.go:117] "RemoveContainer" containerID="f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4" Apr 23 13:33:27.162711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.162562 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8664d9c4bc-prnck" Apr 23 13:33:27.170783 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.170763 2582 scope.go:117] "RemoveContainer" containerID="f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4" Apr 23 13:33:27.171062 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:33:27.171037 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4\": container with ID starting with f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4 not found: ID does not exist" containerID="f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4" Apr 23 13:33:27.171164 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.171072 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4"} err="failed to get container status \"f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4\": rpc error: code = NotFound desc = could not find container \"f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4\": container with ID starting with f2137186d4e7281d39208d262334ac85def5151e484a36926e75791a12f59cf4 not found: ID does not exist" Apr 23 13:33:27.189747 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.189717 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8664d9c4bc-prnck"] Apr 23 13:33:27.194323 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.194295 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8664d9c4bc-prnck"] Apr 23 13:33:27.669337 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:27.669304 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" path="/var/lib/kubelet/pods/afe0d7fe-f3d4-4096-acee-7dd59ac92395/volumes" Apr 23 13:33:33.231506 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.231437 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6db8656c7d-n2krn" podUID="bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" containerName="console" containerID="cri-o://61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571" gracePeriod=15 Apr 23 13:33:33.502587 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.502563 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db8656c7d-n2krn_bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c/console/0.log" Apr 23 13:33:33.502739 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.502624 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:33:33.663076 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663044 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-trusted-ca-bundle\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663076 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663083 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-oauth-serving-cert\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663290 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663120 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cbnr\" (UniqueName: \"kubernetes.io/projected/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-kube-api-access-6cbnr\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663290 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663146 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-config\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663379 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663289 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-serving-cert\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663379 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663372 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-service-ca\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663456 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663401 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-oauth-config\") pod \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\" (UID: \"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c\") " Apr 23 13:33:33.663570 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663545 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:33.663652 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663571 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-config" (OuterVolumeSpecName: "console-config") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:33.663652 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663563 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:33.663814 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.663788 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-service-ca" (OuterVolumeSpecName: "service-ca") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:33:33.665509 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.665489 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-kube-api-access-6cbnr" (OuterVolumeSpecName: "kube-api-access-6cbnr") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). InnerVolumeSpecName "kube-api-access-6cbnr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:33:33.665587 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.665552 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:33.665587 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.665565 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" (UID: "bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763869 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-service-ca\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763900 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-oauth-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763910 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-trusted-ca-bundle\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763944 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-oauth-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763953 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cbnr\" (UniqueName: \"kubernetes.io/projected/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-kube-api-access-6cbnr\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763962 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:33.763975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:33.763970 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c-console-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:33:34.182218 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.182190 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db8656c7d-n2krn_bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c/console/0.log" Apr 23 13:33:34.182387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.182232 2582 generic.go:358] "Generic (PLEG): container finished" podID="bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" containerID="61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571" exitCode=2 Apr 23 13:33:34.182387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.182299 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db8656c7d-n2krn" event={"ID":"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c","Type":"ContainerDied","Data":"61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571"} Apr 23 13:33:34.182387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.182306 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6db8656c7d-n2krn" Apr 23 13:33:34.182387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.182334 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db8656c7d-n2krn" event={"ID":"bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c","Type":"ContainerDied","Data":"b5979459912bb4cbfefc72a4bd449469391a309988dbf5691a56965fd7d99a20"} Apr 23 13:33:34.182387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.182356 2582 scope.go:117] "RemoveContainer" containerID="61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571" Apr 23 13:33:34.190479 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.190456 2582 scope.go:117] "RemoveContainer" containerID="61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571" Apr 23 13:33:34.190752 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:33:34.190733 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571\": container with ID starting with 61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571 not found: ID does not exist" containerID="61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571" Apr 23 13:33:34.190817 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.190757 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571"} err="failed to get container status \"61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571\": rpc error: code = NotFound desc = could not find container \"61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571\": container with ID starting with 61633964640e4a83349b557a8fa118a85d19b87485135c2116a1bd8c32564571 not found: ID does not exist" Apr 23 13:33:34.202866 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.202823 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db8656c7d-n2krn"] Apr 23 13:33:34.210465 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:34.210436 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6db8656c7d-n2krn"] Apr 23 13:33:35.669792 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:33:35.669755 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" path="/var/lib/kubelet/pods/bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c/volumes" Apr 23 13:34:20.109862 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.109822 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6685bfb586-fbtcv"] Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110090 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" containerName="registry" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110101 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" containerName="registry" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110118 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" containerName="console" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110127 2582 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" containerName="console" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110137 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" containerName="console" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110142 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" containerName="console" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110183 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="afe0d7fe-f3d4-4096-acee-7dd59ac92395" containerName="registry" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110190 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3f8619b-4dbe-4fb4-8ef6-4f6d829574e4" containerName="console" Apr 23 13:34:20.110342 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.110197 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf1bd409-5535-49e4-9d2b-fa05f2d2fb5c" containerName="console" Apr 23 13:34:20.112803 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.112786 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.116158 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.116125 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:34:20.117198 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.117177 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:34:20.117307 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.117279 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7ghqn\"" Apr 23 13:34:20.117307 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.117284 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:34:20.117425 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.117297 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:34:20.117425 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.117328 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:34:20.123709 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.123688 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:34:20.131905 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.131873 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6685bfb586-fbtcv"] Apr 23 13:34:20.196502 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196455 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcgjb\" (UniqueName: \"kubernetes.io/projected/2100842e-6d0c-45ab-816a-fac52cb5e7ad-kube-api-access-dcgjb\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.196502 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196501 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-oauth-config\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.196731 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196569 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-service-ca\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.196731 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196624 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-trusted-ca-bundle\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.196731 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196658 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-serving-cert\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.196731 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-oauth-serving-cert\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.196731 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.196726 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-config\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297325 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297291 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcgjb\" (UniqueName: \"kubernetes.io/projected/2100842e-6d0c-45ab-816a-fac52cb5e7ad-kube-api-access-dcgjb\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297325 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297327 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-oauth-config\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297516 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297346 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-service-ca\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297516 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297368 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-trusted-ca-bundle\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297516 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297399 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-serving-cert\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297516 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297422 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-oauth-serving-cert\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.297516 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.297447 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-config\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.298314 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.298281 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-config\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.298426 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.298281 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-service-ca\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.298426 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.298323 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-oauth-serving-cert\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.298426 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.298399 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-trusted-ca-bundle\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.300127 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.300104 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-serving-cert\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.300218 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.300111 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-oauth-config\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.307468 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.307446 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcgjb\" (UniqueName: \"kubernetes.io/projected/2100842e-6d0c-45ab-816a-fac52cb5e7ad-kube-api-access-dcgjb\") pod \"console-6685bfb586-fbtcv\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.422653 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.422607 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:20.558225 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:20.558142 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6685bfb586-fbtcv"] Apr 23 13:34:20.561736 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:34:20.561707 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2100842e_6d0c_45ab_816a_fac52cb5e7ad.slice/crio-febfc221f445a14d4befe15d7b6cc9d2033e675d3f3a7f8c712ee87b640324ce WatchSource:0}: Error finding container febfc221f445a14d4befe15d7b6cc9d2033e675d3f3a7f8c712ee87b640324ce: Status 404 returned error can't find the container with id febfc221f445a14d4befe15d7b6cc9d2033e675d3f3a7f8c712ee87b640324ce Apr 23 13:34:21.316504 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:21.316467 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6685bfb586-fbtcv" event={"ID":"2100842e-6d0c-45ab-816a-fac52cb5e7ad","Type":"ContainerStarted","Data":"88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135"} Apr 23 13:34:21.316504 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:21.316506 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6685bfb586-fbtcv" event={"ID":"2100842e-6d0c-45ab-816a-fac52cb5e7ad","Type":"ContainerStarted","Data":"febfc221f445a14d4befe15d7b6cc9d2033e675d3f3a7f8c712ee87b640324ce"} Apr 23 13:34:21.334703 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:21.334651 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6685bfb586-fbtcv" podStartSLOduration=1.334634754 podStartE2EDuration="1.334634754s" podCreationTimestamp="2026-04-23 13:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:21.333154441 +0000 UTC m=+164.320395320" watchObservedRunningTime="2026-04-23 13:34:21.334634754 +0000 UTC m=+164.321875633" Apr 23 13:34:27.966162 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:27.966122 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6685bfb586-fbtcv"] Apr 23 
13:34:27.998070 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:27.998031 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c4c8956bf-rxvzc"] Apr 23 13:34:28.001055 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.001038 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.010541 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.010516 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4c8956bf-rxvzc"] Apr 23 13:34:28.154057 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154000 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-oauth-serving-cert\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.154057 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154061 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-console-config\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.154289 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154093 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-trusted-ca-bundle\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.154289 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154118 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-serving-cert\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.154289 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154133 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-service-ca\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.154289 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154158 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmkc\" (UniqueName: \"kubernetes.io/projected/52710e8b-d832-4822-bcc8-f588e9125e9a-kube-api-access-ndmkc\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.154289 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.154237 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-oauth-config\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " 
pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255304 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255237 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmkc\" (UniqueName: \"kubernetes.io/projected/52710e8b-d832-4822-bcc8-f588e9125e9a-kube-api-access-ndmkc\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255304 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255270 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-oauth-config\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255304 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255305 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-oauth-serving-cert\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255465 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255336 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-console-config\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255511 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255475 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-trusted-ca-bundle\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255521 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-serving-cert\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.255598 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.255553 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-service-ca\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.256052 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.256022 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-console-config\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.256171 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.256056 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-oauth-serving-cert\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.256252 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.256236 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-service-ca\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.256596 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.256575 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-trusted-ca-bundle\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.258026 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.258002 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-serving-cert\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.258128 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.258104 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-oauth-config\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.263730 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.263707 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmkc\" (UniqueName: \"kubernetes.io/projected/52710e8b-d832-4822-bcc8-f588e9125e9a-kube-api-access-ndmkc\") pod \"console-6c4c8956bf-rxvzc\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.309755 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.309705 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:28.434551 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:28.434513 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4c8956bf-rxvzc"] Apr 23 13:34:28.438741 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:34:28.438708 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52710e8b_d832_4822_bcc8_f588e9125e9a.slice/crio-ebd4e054137f423988fa44952f541aa9f5a6356396c8ba4754369655d72e0ee2 WatchSource:0}: Error finding container ebd4e054137f423988fa44952f541aa9f5a6356396c8ba4754369655d72e0ee2: Status 404 returned error can't find the container with id ebd4e054137f423988fa44952f541aa9f5a6356396c8ba4754369655d72e0ee2 Apr 23 13:34:29.338035 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:29.337996 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4c8956bf-rxvzc" event={"ID":"52710e8b-d832-4822-bcc8-f588e9125e9a","Type":"ContainerStarted","Data":"eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64"} Apr 23 13:34:29.338035 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:29.338039 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4c8956bf-rxvzc" event={"ID":"52710e8b-d832-4822-bcc8-f588e9125e9a","Type":"ContainerStarted","Data":"ebd4e054137f423988fa44952f541aa9f5a6356396c8ba4754369655d72e0ee2"} Apr 23 13:34:29.369308 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:29.369256 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c4c8956bf-rxvzc" podStartSLOduration=2.369238246 podStartE2EDuration="2.369238246s" podCreationTimestamp="2026-04-23 13:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:29.366262059 +0000 UTC m=+172.353502939" watchObservedRunningTime="2026-04-23 13:34:29.369238246 +0000 UTC m=+172.356479125" Apr 23 13:34:30.423127 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:30.423080 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:38.310747 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:38.310685 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:38.310747 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:38.310752 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:38.315398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:38.315377 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:38.362958 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:38.362908 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:34:52.984828 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:52.984763 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6685bfb586-fbtcv" podUID="2100842e-6d0c-45ab-816a-fac52cb5e7ad" containerName="console" containerID="cri-o://88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135" gracePeriod=15 Apr 23 13:34:53.217023 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:34:53.216996 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6685bfb586-fbtcv_2100842e-6d0c-45ab-816a-fac52cb5e7ad/console/0.log" Apr 23 13:34:53.217144 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.217057 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:53.332861 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.332764 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-serving-cert\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.332861 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.332830 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-oauth-serving-cert\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.333099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.332866 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-service-ca\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.333099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.332974 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-trusted-ca-bundle\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.333099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333027 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-config\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.333099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333062 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-oauth-config\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.333363 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333119 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcgjb\" (UniqueName: \"kubernetes.io/projected/2100842e-6d0c-45ab-816a-fac52cb5e7ad-kube-api-access-dcgjb\") pod \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\" (UID: \"2100842e-6d0c-45ab-816a-fac52cb5e7ad\") " Apr 23 13:34:53.333363 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333328 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-service-ca" (OuterVolumeSpecName: "service-ca") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:53.333363 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333328 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:53.333510 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333424 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:53.333510 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.333426 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-config" (OuterVolumeSpecName: "console-config") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:53.335152 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.335114 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:53.335394 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.335372 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2100842e-6d0c-45ab-816a-fac52cb5e7ad-kube-api-access-dcgjb" (OuterVolumeSpecName: "kube-api-access-dcgjb") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "kube-api-access-dcgjb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:53.335471 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.335406 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2100842e-6d0c-45ab-816a-fac52cb5e7ad" (UID: "2100842e-6d0c-45ab-816a-fac52cb5e7ad"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:53.401738 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.401708 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6685bfb586-fbtcv_2100842e-6d0c-45ab-816a-fac52cb5e7ad/console/0.log" Apr 23 13:34:53.401953 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.401747 2582 generic.go:358] "Generic (PLEG): container finished" podID="2100842e-6d0c-45ab-816a-fac52cb5e7ad" containerID="88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135" exitCode=2 Apr 23 13:34:53.401953 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.401824 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6685bfb586-fbtcv" event={"ID":"2100842e-6d0c-45ab-816a-fac52cb5e7ad","Type":"ContainerDied","Data":"88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135"} Apr 23 13:34:53.401953 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.401832 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6685bfb586-fbtcv" Apr 23 13:34:53.401953 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.401852 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6685bfb586-fbtcv" event={"ID":"2100842e-6d0c-45ab-816a-fac52cb5e7ad","Type":"ContainerDied","Data":"febfc221f445a14d4befe15d7b6cc9d2033e675d3f3a7f8c712ee87b640324ce"} Apr 23 13:34:53.401953 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.401868 2582 scope.go:117] "RemoveContainer" containerID="88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135" Apr 23 13:34:53.410373 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.410337 2582 scope.go:117] "RemoveContainer" containerID="88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135" Apr 23 13:34:53.410646 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:34:53.410624 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135\": container with ID starting with 88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135 not found: ID does not exist" containerID="88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135" Apr 23 13:34:53.410713 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.410657 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135"} err="failed to get container status \"88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135\": rpc error: code = NotFound desc = could not find container \"88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135\": container with ID starting with 88b405c0bc660676cabe9f71defef79add2cab7120d6a888d72177dffce36135 not found: ID does not exist" Apr 23 13:34:53.423489 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.423460 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6685bfb586-fbtcv"] Apr 23 13:34:53.427261 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.427240 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6685bfb586-fbtcv"] Apr 23 13:34:53.433992 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.433972 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-oauth-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.434086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.433994 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-service-ca\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.434086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.434008 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-trusted-ca-bundle\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.434086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.434021 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.434086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.434030 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-oauth-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.434086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.434039 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dcgjb\" (UniqueName: \"kubernetes.io/projected/2100842e-6d0c-45ab-816a-fac52cb5e7ad-kube-api-access-dcgjb\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.434086 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.434048 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2100842e-6d0c-45ab-816a-fac52cb5e7ad-console-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:34:53.669450 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:34:53.669420 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2100842e-6d0c-45ab-816a-fac52cb5e7ad" path="/var/lib/kubelet/pods/2100842e-6d0c-45ab-816a-fac52cb5e7ad/volumes" Apr 23 13:35:17.839517 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.839477 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n"] Apr 23 13:35:17.839954 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.839711 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2100842e-6d0c-45ab-816a-fac52cb5e7ad" containerName="console" Apr 23 13:35:17.839954 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.839721 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2100842e-6d0c-45ab-816a-fac52cb5e7ad" containerName="console" Apr 23 13:35:17.839954 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.839784 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="2100842e-6d0c-45ab-816a-fac52cb5e7ad" containerName="console" Apr 23 13:35:17.844488 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.844465 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:17.847343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.847323 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-556nh\"" Apr 23 13:35:17.848485 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.848467 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 13:35:17.848575 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.848517 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 13:35:17.859687 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:17.859657 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n"] Apr 23 13:35:18.007794 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.007753 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.007794 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.007795 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.008073 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.007901 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jcc\" (UniqueName: \"kubernetes.io/projected/c557d8fb-91a4-4a51-8ee8-8365ef08370e-kube-api-access-x8jcc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.108410 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.108312 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.108410 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.108355 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.108410 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.108391 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x8jcc\" (UniqueName: \"kubernetes.io/projected/c557d8fb-91a4-4a51-8ee8-8365ef08370e-kube-api-access-x8jcc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.108751 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.108728 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.108818 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.108792 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.118495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.118459 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jcc\" (UniqueName: \"kubernetes.io/projected/c557d8fb-91a4-4a51-8ee8-8365ef08370e-kube-api-access-x8jcc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.153880 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.153829 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:18.273873 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.273848 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n"] Apr 23 13:35:18.276442 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:35:18.276410 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc557d8fb_91a4_4a51_8ee8_8365ef08370e.slice/crio-77821ff937982f83815584315a788a1117d5710197c74ab82cf8274c5ca2c091 WatchSource:0}: Error finding container 77821ff937982f83815584315a788a1117d5710197c74ab82cf8274c5ca2c091: Status 404 returned error can't find the container with id 77821ff937982f83815584315a788a1117d5710197c74ab82cf8274c5ca2c091 Apr 23 13:35:18.471005 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:18.470954 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" event={"ID":"c557d8fb-91a4-4a51-8ee8-8365ef08370e","Type":"ContainerStarted","Data":"77821ff937982f83815584315a788a1117d5710197c74ab82cf8274c5ca2c091"} Apr 23 13:35:23.486837 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:23.486797 2582 generic.go:358] "Generic (PLEG): container finished" podID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerID="91334db20675b59fe605dcd88c902ac8ab05793f08f1f834c40940a32a9e5f18" exitCode=0 Apr 23 13:35:23.487385 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:23.486860 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" event={"ID":"c557d8fb-91a4-4a51-8ee8-8365ef08370e","Type":"ContainerDied","Data":"91334db20675b59fe605dcd88c902ac8ab05793f08f1f834c40940a32a9e5f18"} Apr 23 13:35:27.671500 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.671462 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8"] Apr 23 13:35:27.681575 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.681546 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.683136 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.683109 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8"] Apr 23 13:35:27.685336 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.685304 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 13:35:27.685454 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.685331 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 13:35:27.685454 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.685412 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 13:35:27.686494 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.686474 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-cbq8q\"" Apr 23 13:35:27.686602 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.686502 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 13:35:27.772826 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.772781 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd"] Apr 23 13:35:27.777958 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.777913 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.782235 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.782203 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k22kl\" (UniqueName: \"kubernetes.io/projected/10a42769-c84e-4aa9-8d7b-e21a713c2375-kube-api-access-k22kl\") pod \"managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8\" (UID: \"10a42769-c84e-4aa9-8d7b-e21a713c2375\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.782379 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.782304 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10a42769-c84e-4aa9-8d7b-e21a713c2375-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8\" (UID: \"10a42769-c84e-4aa9-8d7b-e21a713c2375\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.785008 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.784982 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 13:35:27.802520 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.802487 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd"] Apr 23 13:35:27.809161 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.809135 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g"] Apr 23 13:35:27.813888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.813869 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.816531 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.816505 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 13:35:27.817232 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.817103 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 13:35:27.817512 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.817486 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 13:35:27.818219 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.818198 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 13:35:27.825488 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.825464 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g"] Apr 23 13:35:27.883673 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883639 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.883673 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883673 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbwxp\" (UniqueName: \"kubernetes.io/projected/41401f6f-43cb-41cc-83de-65963136748c-kube-api-access-cbwxp\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.883881 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883699 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/41401f6f-43cb-41cc-83de-65963136748c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.883881 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883748 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-hub\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.883881 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883829 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-klusterlet-config\") pod 
\"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.883881 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883879 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10a42769-c84e-4aa9-8d7b-e21a713c2375-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8\" (UID: \"10a42769-c84e-4aa9-8d7b-e21a713c2375\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.884101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883901 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-ca\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.884101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.883978 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-tmp\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.884101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.884024 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprpg\" (UniqueName: \"kubernetes.io/projected/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-kube-api-access-hprpg\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.884101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.884061 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.884220 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.884129 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k22kl\" (UniqueName: \"kubernetes.io/projected/10a42769-c84e-4aa9-8d7b-e21a713c2375-kube-api-access-k22kl\") pod \"managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8\" (UID: \"10a42769-c84e-4aa9-8d7b-e21a713c2375\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.886833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.886802 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10a42769-c84e-4aa9-8d7b-e21a713c2375-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8\" (UID: \"10a42769-c84e-4aa9-8d7b-e21a713c2375\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.904389 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:35:27.904358 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k22kl\" (UniqueName: \"kubernetes.io/projected/10a42769-c84e-4aa9-8d7b-e21a713c2375-kube-api-access-k22kl\") pod \"managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8\" (UID: \"10a42769-c84e-4aa9-8d7b-e21a713c2375\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:27.984571 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984542 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-ca\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.984704 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984581 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-tmp\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.984704 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984613 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hprpg\" (UniqueName: \"kubernetes.io/projected/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-kube-api-access-hprpg\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.984704 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984647 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.984704 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984686 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.984905 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbwxp\" (UniqueName: \"kubernetes.io/projected/41401f6f-43cb-41cc-83de-65963136748c-kube-api-access-cbwxp\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.984905 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984747 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/41401f6f-43cb-41cc-83de-65963136748c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.984905 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984781 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-hub\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.984905 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.984832 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-klusterlet-config\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.985135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.985061 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-tmp\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.986159 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.986131 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/41401f6f-43cb-41cc-83de-65963136748c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.987385 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.987359 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-ca\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.987474 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.987369 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.987648 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.987628 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.987908 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.987892 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-klusterlet-config\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: 
\"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.988007 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.987946 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/41401f6f-43cb-41cc-83de-65963136748c-hub\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:27.994083 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.994059 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprpg\" (UniqueName: \"kubernetes.io/projected/1cfc6b61-16b8-46ed-a36d-12ef5e322ea5-kube-api-access-hprpg\") pod \"klusterlet-addon-workmgr-8546b78797-6kvkd\" (UID: \"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:27.994394 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:27.994378 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbwxp\" (UniqueName: \"kubernetes.io/projected/41401f6f-43cb-41cc-83de-65963136748c-kube-api-access-cbwxp\") pod \"cluster-proxy-proxy-agent-5965f6d669-ptm4g\" (UID: \"41401f6f-43cb-41cc-83de-65963136748c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:28.002621 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.002599 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" Apr 23 13:35:28.089900 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.089868 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:28.124730 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.124701 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" Apr 23 13:35:28.132422 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.132397 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8"] Apr 23 13:35:28.150290 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:35:28.150251 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10a42769_c84e_4aa9_8d7b_e21a713c2375.slice/crio-736c867b9468a05208254195972789f063c7555b9cd1a082958f4548a61c6393 WatchSource:0}: Error finding container 736c867b9468a05208254195972789f063c7555b9cd1a082958f4548a61c6393: Status 404 returned error can't find the container with id 736c867b9468a05208254195972789f063c7555b9cd1a082958f4548a61c6393 Apr 23 13:35:28.220330 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.220297 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd"] Apr 23 13:35:28.225077 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:35:28.225047 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cfc6b61_16b8_46ed_a36d_12ef5e322ea5.slice/crio-7a7dd3d299822091900f7ada851d30c4ac9c32642b5f5961a21b77a45988273f WatchSource:0}: Error finding container 7a7dd3d299822091900f7ada851d30c4ac9c32642b5f5961a21b77a45988273f: Status 404 returned error can't find the container with id 7a7dd3d299822091900f7ada851d30c4ac9c32642b5f5961a21b77a45988273f Apr 23 13:35:28.266521 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.266498 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g"] Apr 23 13:35:28.268956 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:35:28.268933 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41401f6f_43cb_41cc_83de_65963136748c.slice/crio-5765f1b614cbc106572052ab5f6285a6556b57f3c5533aa125973a08039f1f9f WatchSource:0}: Error finding container 5765f1b614cbc106572052ab5f6285a6556b57f3c5533aa125973a08039f1f9f: Status 404 returned error can't find the container with id 5765f1b614cbc106572052ab5f6285a6556b57f3c5533aa125973a08039f1f9f Apr 23 13:35:28.502015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.501898 2582 generic.go:358] "Generic (PLEG): container finished" podID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerID="d2f3956a26586eda93a1af097d46ed29594b6d1d8c740783765d311429b178b3" exitCode=0 Apr 23 13:35:28.502015 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.501996 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" event={"ID":"c557d8fb-91a4-4a51-8ee8-8365ef08370e","Type":"ContainerDied","Data":"d2f3956a26586eda93a1af097d46ed29594b6d1d8c740783765d311429b178b3"} Apr 23 13:35:28.503172 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.503148 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" event={"ID":"41401f6f-43cb-41cc-83de-65963136748c","Type":"ContainerStarted","Data":"5765f1b614cbc106572052ab5f6285a6556b57f3c5533aa125973a08039f1f9f"} Apr 23 13:35:28.504268 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:35:28.504243 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" event={"ID":"10a42769-c84e-4aa9-8d7b-e21a713c2375","Type":"ContainerStarted","Data":"736c867b9468a05208254195972789f063c7555b9cd1a082958f4548a61c6393"} Apr 23 13:35:28.505228 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:28.505204 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" event={"ID":"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5","Type":"ContainerStarted","Data":"7a7dd3d299822091900f7ada851d30c4ac9c32642b5f5961a21b77a45988273f"} Apr 23 13:35:33.524791 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:33.524748 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" event={"ID":"41401f6f-43cb-41cc-83de-65963136748c","Type":"ContainerStarted","Data":"3d8ba757c39b6c1800c9cfa0655e6a9f437b10cd1ded297f74d927e507efa17c"} Apr 23 13:35:33.526285 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:33.526259 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" event={"ID":"10a42769-c84e-4aa9-8d7b-e21a713c2375","Type":"ContainerStarted","Data":"a4e3b96b39b036352544efbec54c9e3a546ae9a8739a68fa4dc6ddbca6f5b773"} Apr 23 13:35:33.541793 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:33.541727 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cc8b45c6c-mn2z8" podStartSLOduration=1.943499166 podStartE2EDuration="6.541707198s" podCreationTimestamp="2026-04-23 13:35:27 +0000 UTC" firstStartedPulling="2026-04-23 13:35:28.152829289 +0000 UTC m=+231.140070148" lastFinishedPulling="2026-04-23 13:35:32.751037324 +0000 UTC m=+235.738278180" observedRunningTime="2026-04-23 13:35:33.541591766 +0000 UTC m=+236.528832645" watchObservedRunningTime="2026-04-23 13:35:33.541707198 +0000 UTC m=+236.528948082" Apr 23 13:35:35.534102 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:35.534010 2582 generic.go:358] "Generic (PLEG): container finished" podID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerID="a78b51f0a951a5d7ad777b0abacf6015cfdcfe350f72f94a079a1b6aed91c22f" exitCode=0 Apr 23 13:35:35.534505 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:35.534103 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" event={"ID":"c557d8fb-91a4-4a51-8ee8-8365ef08370e","Type":"ContainerDied","Data":"a78b51f0a951a5d7ad777b0abacf6015cfdcfe350f72f94a079a1b6aed91c22f"} Apr 23 13:35:36.774903 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.774881 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:36.863887 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.863864 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-bundle\") pod \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " Apr 23 13:35:36.864012 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.863935 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8jcc\" (UniqueName: \"kubernetes.io/projected/c557d8fb-91a4-4a51-8ee8-8365ef08370e-kube-api-access-x8jcc\") pod \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " Apr 23 13:35:36.864012 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.863996 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-util\") pod \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\" (UID: \"c557d8fb-91a4-4a51-8ee8-8365ef08370e\") " Apr 23 13:35:36.864429 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.864396 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-bundle" (OuterVolumeSpecName: "bundle") pod "c557d8fb-91a4-4a51-8ee8-8365ef08370e" (UID: "c557d8fb-91a4-4a51-8ee8-8365ef08370e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:36.866187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.866167 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c557d8fb-91a4-4a51-8ee8-8365ef08370e-kube-api-access-x8jcc" (OuterVolumeSpecName: "kube-api-access-x8jcc") pod "c557d8fb-91a4-4a51-8ee8-8365ef08370e" (UID: "c557d8fb-91a4-4a51-8ee8-8365ef08370e"). InnerVolumeSpecName "kube-api-access-x8jcc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:36.867893 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.867864 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-util" (OuterVolumeSpecName: "util") pod "c557d8fb-91a4-4a51-8ee8-8365ef08370e" (UID: "c557d8fb-91a4-4a51-8ee8-8365ef08370e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:36.964735 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.964702 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-util\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:35:36.964735 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.964734 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c557d8fb-91a4-4a51-8ee8-8365ef08370e-bundle\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:35:36.964906 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:36.964748 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8jcc\" (UniqueName: \"kubernetes.io/projected/c557d8fb-91a4-4a51-8ee8-8365ef08370e-kube-api-access-x8jcc\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.541294 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:37.541253 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" event={"ID":"c557d8fb-91a4-4a51-8ee8-8365ef08370e","Type":"ContainerDied","Data":"77821ff937982f83815584315a788a1117d5710197c74ab82cf8274c5ca2c091"} Apr 23 13:35:37.541482 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:37.541299 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77821ff937982f83815584315a788a1117d5710197c74ab82cf8274c5ca2c091" Apr 23 13:35:37.541482 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:37.541342 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfg5n" Apr 23 13:35:37.543225 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:37.543185 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" event={"ID":"41401f6f-43cb-41cc-83de-65963136748c","Type":"ContainerStarted","Data":"3c9b45c4c83e3685e3d16861eb8c84267153aefd7e69e33dbd64fbd754718c8e"} Apr 23 13:35:37.543225 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:37.543222 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" event={"ID":"41401f6f-43cb-41cc-83de-65963136748c","Type":"ContainerStarted","Data":"82f8e946d36e7dc691f90a172f54b9d6cc80e934cbb3815ed406c70836839d29"} Apr 23 13:35:37.562723 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:37.562676 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5965f6d669-ptm4g" podStartSLOduration=2.008042157 podStartE2EDuration="10.562662556s" podCreationTimestamp="2026-04-23 13:35:27 +0000 UTC" firstStartedPulling="2026-04-23 13:35:28.270637788 +0000 UTC m=+231.257878645" lastFinishedPulling="2026-04-23 13:35:36.825258173 +0000 UTC m=+239.812499044" observedRunningTime="2026-04-23 13:35:37.56050308 +0000 UTC m=+240.547743958" watchObservedRunningTime="2026-04-23 13:35:37.562662556 +0000 UTC m=+240.549903494" Apr 23 13:35:44.569547 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:44.569510 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" 
event={"ID":"1cfc6b61-16b8-46ed-a36d-12ef5e322ea5","Type":"ContainerStarted","Data":"8919c43c5da629eaa59e098aa47de78568cb04a4f33f81a37c04487d947e4f3a"} Apr 23 13:35:44.570044 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:44.569712 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:44.571316 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:44.571297 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" Apr 23 13:35:44.588211 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:35:44.588165 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8546b78797-6kvkd" podStartSLOduration=1.373444064 podStartE2EDuration="17.588151192s" podCreationTimestamp="2026-04-23 13:35:27 +0000 UTC" firstStartedPulling="2026-04-23 13:35:28.226910071 +0000 UTC m=+231.214150928" lastFinishedPulling="2026-04-23 13:35:44.441617199 +0000 UTC m=+247.428858056" observedRunningTime="2026-04-23 13:35:44.586974466 +0000 UTC m=+247.574215343" watchObservedRunningTime="2026-04-23 13:35:44.588151192 +0000 UTC m=+247.575392071" Apr 23 13:36:37.536751 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:37.536717 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:36:37.537300 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:37.537132 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:36:37.541367 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:37.541348 2582 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 13:36:49.972145 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972114 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-hsvj8"] Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972357 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="extract" Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972368 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="extract" Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972378 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="pull" Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972383 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="pull" Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972398 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="util" Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972405 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="util" Apr 23 13:36:49.974464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.972450 2582 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c557d8fb-91a4-4a51-8ee8-8365ef08370e" containerName="extract" Apr 23 13:36:49.975205 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.975189 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:49.977954 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.977909 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:36:49.978835 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.978814 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 13:36:49.978835 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.978827 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qx2w\"" Apr 23 13:36:49.979069 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.978841 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:36:49.987513 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.987485 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hsvj8"] Apr 23 13:36:49.999288 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.999261 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1148e33c-d2ca-4cdd-b81a-604ba117dd37-data\") pod \"seaweedfs-86cc847c5c-hsvj8\" (UID: \"1148e33c-d2ca-4cdd-b81a-604ba117dd37\") " pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:49.999406 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:49.999344 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqn5\" (UniqueName: \"kubernetes.io/projected/1148e33c-d2ca-4cdd-b81a-604ba117dd37-kube-api-access-8wqn5\") pod \"seaweedfs-86cc847c5c-hsvj8\" (UID: \"1148e33c-d2ca-4cdd-b81a-604ba117dd37\") " pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:50.100234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.100199 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1148e33c-d2ca-4cdd-b81a-604ba117dd37-data\") pod \"seaweedfs-86cc847c5c-hsvj8\" (UID: \"1148e33c-d2ca-4cdd-b81a-604ba117dd37\") " pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:50.100398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.100253 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqn5\" (UniqueName: \"kubernetes.io/projected/1148e33c-d2ca-4cdd-b81a-604ba117dd37-kube-api-access-8wqn5\") pod \"seaweedfs-86cc847c5c-hsvj8\" (UID: \"1148e33c-d2ca-4cdd-b81a-604ba117dd37\") " pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:50.100576 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.100555 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1148e33c-d2ca-4cdd-b81a-604ba117dd37-data\") pod \"seaweedfs-86cc847c5c-hsvj8\" (UID: \"1148e33c-d2ca-4cdd-b81a-604ba117dd37\") " pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:50.108765 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.108745 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqn5\" (UniqueName: \"kubernetes.io/projected/1148e33c-d2ca-4cdd-b81a-604ba117dd37-kube-api-access-8wqn5\") pod \"seaweedfs-86cc847c5c-hsvj8\" 
(UID: \"1148e33c-d2ca-4cdd-b81a-604ba117dd37\") " pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:50.286172 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.286072 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:50.404597 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.404567 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hsvj8"] Apr 23 13:36:50.407765 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:36:50.407739 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1148e33c_d2ca_4cdd_b81a_604ba117dd37.slice/crio-53329e610d30971bb87b762fe75f5acecbb7de9bdf0117bb2fab986b63ca9788 WatchSource:0}: Error finding container 53329e610d30971bb87b762fe75f5acecbb7de9bdf0117bb2fab986b63ca9788: Status 404 returned error can't find the container with id 53329e610d30971bb87b762fe75f5acecbb7de9bdf0117bb2fab986b63ca9788 Apr 23 13:36:50.408981 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.408966 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:36:50.740357 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:50.740305 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hsvj8" event={"ID":"1148e33c-d2ca-4cdd-b81a-604ba117dd37","Type":"ContainerStarted","Data":"53329e610d30971bb87b762fe75f5acecbb7de9bdf0117bb2fab986b63ca9788"} Apr 23 13:36:53.748821 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:53.748788 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hsvj8" event={"ID":"1148e33c-d2ca-4cdd-b81a-604ba117dd37","Type":"ContainerStarted","Data":"6a211a7d793bb0fdcdcff2c5495602359aeda9a71149a39e5097d5b94478cdc5"} Apr 23 13:36:53.749198 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:53.748909 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:36:53.764265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:53.764211 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-hsvj8" podStartSLOduration=1.4859418149999999 podStartE2EDuration="4.764192906s" podCreationTimestamp="2026-04-23 13:36:49 +0000 UTC" firstStartedPulling="2026-04-23 13:36:50.409080058 +0000 UTC m=+313.396320918" lastFinishedPulling="2026-04-23 13:36:53.687331149 +0000 UTC m=+316.674572009" observedRunningTime="2026-04-23 13:36:53.762652441 +0000 UTC m=+316.749893320" watchObservedRunningTime="2026-04-23 13:36:53.764192906 +0000 UTC m=+316.751433787" Apr 23 13:36:59.753418 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:36:59.753390 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-hsvj8" Apr 23 13:39:10.850001 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.849907 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h"] Apr 23 13:39:10.852883 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.852867 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:10.855234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.855208 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 23 13:39:10.855360 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.855321 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 23 13:39:10.860139 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.860117 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h"] Apr 23 13:39:10.934297 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.934266 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/e40efd34-7dd8-4a14-985b-f10d4925bd6e-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:10.934469 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.934312 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7n4\" (UniqueName: \"kubernetes.io/projected/e40efd34-7dd8-4a14-985b-f10d4925bd6e-kube-api-access-kd7n4\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:10.934469 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:10.934340 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e40efd34-7dd8-4a14-985b-f10d4925bd6e-data\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.035726 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.035684 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/e40efd34-7dd8-4a14-985b-f10d4925bd6e-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.035894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.035739 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7n4\" (UniqueName: \"kubernetes.io/projected/e40efd34-7dd8-4a14-985b-f10d4925bd6e-kube-api-access-kd7n4\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.035894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.035773 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e40efd34-7dd8-4a14-985b-f10d4925bd6e-data\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.036237 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.036213 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e40efd34-7dd8-4a14-985b-f10d4925bd6e-data\") pod 
\"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.038312 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.038293 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/e40efd34-7dd8-4a14-985b-f10d4925bd6e-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.043830 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.043810 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7n4\" (UniqueName: \"kubernetes.io/projected/e40efd34-7dd8-4a14-985b-f10d4925bd6e-kube-api-access-kd7n4\") pod \"seaweedfs-tls-serving-7fd5766db9-xhn2h\" (UID: \"e40efd34-7dd8-4a14-985b-f10d4925bd6e\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.163012 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.162988 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" Apr 23 13:39:11.285544 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:11.285511 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h"] Apr 23 13:39:11.288825 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:39:11.288790 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40efd34_7dd8_4a14_985b_f10d4925bd6e.slice/crio-83730360f32369873f75cc789b4b6b72a6886a240819adfd96a1b4fb4c4dc027 WatchSource:0}: Error finding container 83730360f32369873f75cc789b4b6b72a6886a240819adfd96a1b4fb4c4dc027: Status 404 returned error can't find the container with id 83730360f32369873f75cc789b4b6b72a6886a240819adfd96a1b4fb4c4dc027 Apr 23 13:39:12.118991 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:12.118953 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" event={"ID":"e40efd34-7dd8-4a14-985b-f10d4925bd6e","Type":"ContainerStarted","Data":"9e4a822a56e31ba76327e5fa8c2a605932a9b0864ac9dfe92c1a2d7a60db5d80"} Apr 23 13:39:12.118991 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:12.118994 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" event={"ID":"e40efd34-7dd8-4a14-985b-f10d4925bd6e","Type":"ContainerStarted","Data":"83730360f32369873f75cc789b4b6b72a6886a240819adfd96a1b4fb4c4dc027"} Apr 23 13:39:12.134512 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:12.134460 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-xhn2h" podStartSLOduration=1.87202172 podStartE2EDuration="2.134445664s" podCreationTimestamp="2026-04-23 13:39:10 +0000 UTC" firstStartedPulling="2026-04-23 13:39:11.289974254 +0000 UTC m=+454.277215111" lastFinishedPulling="2026-04-23 13:39:11.552398198 +0000 UTC m=+454.539639055" observedRunningTime="2026-04-23 13:39:12.133406683 +0000 UTC m=+455.120647585" watchObservedRunningTime="2026-04-23 13:39:12.134445664 +0000 UTC m=+455.121686540" Apr 23 13:39:20.463716 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.463682 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8c7b6b7b6-7gbmt"] Apr 23 13:39:20.466696 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.466679 2582 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.481014 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.480984 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8c7b6b7b6-7gbmt"] Apr 23 13:39:20.608431 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608396 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-oauth-serving-cert\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.608431 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608433 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-serving-cert\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.608657 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608458 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-oauth-config\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.608657 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608496 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkjn6\" (UniqueName: \"kubernetes.io/projected/3e36b65e-d6eb-4549-a16c-34dfa89283d2-kube-api-access-nkjn6\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.608657 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608517 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-config\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.608657 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608536 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-service-ca\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.608657 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.608554 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-trusted-ca-bundle\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709394 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709355 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-oauth-config\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709394 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709396 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkjn6\" (UniqueName: \"kubernetes.io/projected/3e36b65e-d6eb-4549-a16c-34dfa89283d2-kube-api-access-nkjn6\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709639 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709422 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-config\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709639 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709450 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-service-ca\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709639 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709478 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-trusted-ca-bundle\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709796 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709659 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-oauth-serving-cert\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.709796 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.709727 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-serving-cert\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.710276 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.710228 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-config\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.710412 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.710341 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-service-ca\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.710481 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.710426 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-oauth-serving-cert\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.710538 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.710494 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e36b65e-d6eb-4549-a16c-34dfa89283d2-trusted-ca-bundle\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.712063 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.712040 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-oauth-config\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.712177 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.712160 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e36b65e-d6eb-4549-a16c-34dfa89283d2-console-serving-cert\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.717497 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.717445 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkjn6\" (UniqueName: \"kubernetes.io/projected/3e36b65e-d6eb-4549-a16c-34dfa89283d2-kube-api-access-nkjn6\") pod \"console-8c7b6b7b6-7gbmt\" (UID: \"3e36b65e-d6eb-4549-a16c-34dfa89283d2\") " pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.777178 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.777131 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:20.903590 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:20.903553 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8c7b6b7b6-7gbmt"] Apr 23 13:39:20.906716 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:39:20.906686 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e36b65e_d6eb_4549_a16c_34dfa89283d2.slice/crio-9d2758886a69801d346f4b01db3845b053683c05d4c99017ae08f7430d424c15 WatchSource:0}: Error finding container 9d2758886a69801d346f4b01db3845b053683c05d4c99017ae08f7430d424c15: Status 404 returned error can't find the container with id 9d2758886a69801d346f4b01db3845b053683c05d4c99017ae08f7430d424c15 Apr 23 13:39:21.145788 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:21.145754 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8c7b6b7b6-7gbmt" event={"ID":"3e36b65e-d6eb-4549-a16c-34dfa89283d2","Type":"ContainerStarted","Data":"ff1a88a42392d4e8fba67b1fd3f147dbe744dcd8485de0c5e0ffb319ad17648a"} Apr 23 13:39:21.145788 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:21.145794 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8c7b6b7b6-7gbmt" event={"ID":"3e36b65e-d6eb-4549-a16c-34dfa89283d2","Type":"ContainerStarted","Data":"9d2758886a69801d346f4b01db3845b053683c05d4c99017ae08f7430d424c15"} Apr 23 13:39:21.163281 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:21.163059 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8c7b6b7b6-7gbmt" podStartSLOduration=1.163043726 podStartE2EDuration="1.163043726s" podCreationTimestamp="2026-04-23 13:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:39:21.162375583 +0000 UTC m=+464.149616463" watchObservedRunningTime="2026-04-23 13:39:21.163043726 +0000 UTC m=+464.150284606" Apr 23 13:39:30.777789 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:30.777740 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:30.777789 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:30.777794 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:30.782274 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:30.782255 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:31.178328 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:31.178301 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8c7b6b7b6-7gbmt" Apr 23 13:39:31.220979 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:31.220947 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c4c8956bf-rxvzc"] Apr 23 13:39:56.240093 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.240031 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c4c8956bf-rxvzc" podUID="52710e8b-d832-4822-bcc8-f588e9125e9a" containerName="console" containerID="cri-o://eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64" gracePeriod=15 Apr 23 13:39:56.476913 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.476886 2582 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4c8956bf-rxvzc_52710e8b-d832-4822-bcc8-f588e9125e9a/console/0.log" Apr 23 13:39:56.477048 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.476967 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:39:56.477214 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477195 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-serving-cert\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477258 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477240 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-oauth-serving-cert\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477303 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477287 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-trusted-ca-bundle\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477320 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-console-config\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477388 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477349 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-oauth-config\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477388 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477381 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-service-ca\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477485 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477410 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndmkc\" (UniqueName: \"kubernetes.io/projected/52710e8b-d832-4822-bcc8-f588e9125e9a-kube-api-access-ndmkc\") pod \"52710e8b-d832-4822-bcc8-f588e9125e9a\" (UID: \"52710e8b-d832-4822-bcc8-f588e9125e9a\") " Apr 23 13:39:56.477698 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477672 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:39:56.477777 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477730 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:39:56.477777 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477743 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-console-config" (OuterVolumeSpecName: "console-config") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:39:56.477777 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.477752 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-service-ca" (OuterVolumeSpecName: "service-ca") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:39:56.479424 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.479395 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:39:56.479549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.479528 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:39:56.479711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.479689 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52710e8b-d832-4822-bcc8-f588e9125e9a-kube-api-access-ndmkc" (OuterVolumeSpecName: "kube-api-access-ndmkc") pod "52710e8b-d832-4822-bcc8-f588e9125e9a" (UID: "52710e8b-d832-4822-bcc8-f588e9125e9a"). InnerVolumeSpecName "kube-api-access-ndmkc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578783 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578808 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-oauth-serving-cert\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578818 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-trusted-ca-bundle\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578826 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-console-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578836 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52710e8b-d832-4822-bcc8-f588e9125e9a-console-oauth-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578845 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52710e8b-d832-4822-bcc8-f588e9125e9a-service-ca\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:56.578860 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:56.578854 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndmkc\" (UniqueName: \"kubernetes.io/projected/52710e8b-d832-4822-bcc8-f588e9125e9a-kube-api-access-ndmkc\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:39:57.246432 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.246403 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4c8956bf-rxvzc_52710e8b-d832-4822-bcc8-f588e9125e9a/console/0.log" Apr 23 13:39:57.246894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.246443 2582 generic.go:358] "Generic (PLEG): container finished" podID="52710e8b-d832-4822-bcc8-f588e9125e9a" containerID="eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64" exitCode=2 Apr 23 13:39:57.246894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.246528 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4c8956bf-rxvzc" Apr 23 13:39:57.246894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.246534 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4c8956bf-rxvzc" event={"ID":"52710e8b-d832-4822-bcc8-f588e9125e9a","Type":"ContainerDied","Data":"eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64"} Apr 23 13:39:57.246894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.246573 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4c8956bf-rxvzc" event={"ID":"52710e8b-d832-4822-bcc8-f588e9125e9a","Type":"ContainerDied","Data":"ebd4e054137f423988fa44952f541aa9f5a6356396c8ba4754369655d72e0ee2"} Apr 23 13:39:57.246894 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.246588 2582 scope.go:117] "RemoveContainer" containerID="eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64" Apr 23 13:39:57.255044 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.255028 2582 scope.go:117] "RemoveContainer" containerID="eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64" Apr 23 13:39:57.255286 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:39:57.255271 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64\": container with ID starting with eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64 not found: ID does not exist" containerID="eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64" Apr 23 13:39:57.255323 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.255294 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64"} err="failed to get container status \"eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64\": rpc error: code = NotFound desc = could not find container \"eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64\": container with ID starting with eafbcca19425e55f4f797088ca8b6d02c9d9d411c89f254a8e0ff56854cfea64 not found: ID does not exist" Apr 23 13:39:57.267219 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.267193 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c4c8956bf-rxvzc"] Apr 23 13:39:57.272064 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.272035 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c4c8956bf-rxvzc"] Apr 23 13:39:57.669175 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:39:57.669144 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52710e8b-d832-4822-bcc8-f588e9125e9a" path="/var/lib/kubelet/pods/52710e8b-d832-4822-bcc8-f588e9125e9a/volumes" Apr 23 13:41:37.554967 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:41:37.554940 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:41:37.555902 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:41:37.555881 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:42:38.601444 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.601406 2582 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg"] Apr 23 13:42:38.601956 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.601652 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52710e8b-d832-4822-bcc8-f588e9125e9a" containerName="console" Apr 23 13:42:38.601956 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.601662 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="52710e8b-d832-4822-bcc8-f588e9125e9a" containerName="console" Apr 23 13:42:38.601956 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.601714 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="52710e8b-d832-4822-bcc8-f588e9125e9a" containerName="console" Apr 23 13:42:38.604549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.604531 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.607257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.607234 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:42:38.607257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.607244 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:42:38.607418 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.607265 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 23 13:42:38.607418 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.607314 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:42:38.607599 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.607242 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 23 13:42:38.613250 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.613229 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg"] Apr 23 13:42:38.740343 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.740310 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02fc9ade-c9a8-4107-ae3a-c7241d49c136-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.740534 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.740360 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.740534 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.740379 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxbq\" (UniqueName: \"kubernetes.io/projected/02fc9ade-c9a8-4107-ae3a-c7241d49c136-kube-api-access-rsxbq\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: 
\"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.841227 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.841198 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxbq\" (UniqueName: \"kubernetes.io/projected/02fc9ade-c9a8-4107-ae3a-c7241d49c136-kube-api-access-rsxbq\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.841386 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.841247 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02fc9ade-c9a8-4107-ae3a-c7241d49c136-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.841386 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.841283 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.841386 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:42:38.841364 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 23 13:42:38.841529 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:42:38.841434 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls podName:02fc9ade-c9a8-4107-ae3a-c7241d49c136 nodeName:}" failed. No retries permitted until 2026-04-23 13:42:39.341417396 +0000 UTC m=+662.328658252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-wrwkg" (UID: "02fc9ade-c9a8-4107-ae3a-c7241d49c136") : secret "message-dumper-predictor-serving-cert" not found Apr 23 13:42:38.842068 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.842045 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02fc9ade-c9a8-4107-ae3a-c7241d49c136-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:38.849999 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:38.849976 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxbq\" (UniqueName: \"kubernetes.io/projected/02fc9ade-c9a8-4107-ae3a-c7241d49c136-kube-api-access-rsxbq\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:39.344639 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:39.344602 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:39.347236 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:39.347208 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wrwkg\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:39.515354 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:39.515318 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:39.637427 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:39.637395 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg"] Apr 23 13:42:39.641283 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:42:39.641258 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02fc9ade_c9a8_4107_ae3a_c7241d49c136.slice/crio-ca53a77531823245c0474f30fe8d28f7ca495467960f8e0e06125c56dad44e44 WatchSource:0}: Error finding container ca53a77531823245c0474f30fe8d28f7ca495467960f8e0e06125c56dad44e44: Status 404 returned error can't find the container with id ca53a77531823245c0474f30fe8d28f7ca495467960f8e0e06125c56dad44e44 Apr 23 13:42:39.642893 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:39.642876 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:42:39.689185 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:39.689155 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" event={"ID":"02fc9ade-c9a8-4107-ae3a-c7241d49c136","Type":"ContainerStarted","Data":"ca53a77531823245c0474f30fe8d28f7ca495467960f8e0e06125c56dad44e44"} Apr 23 13:42:41.698403 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:41.698340 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" event={"ID":"02fc9ade-c9a8-4107-ae3a-c7241d49c136","Type":"ContainerStarted","Data":"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5"} Apr 23 13:42:43.705267 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:43.705236 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" event={"ID":"02fc9ade-c9a8-4107-ae3a-c7241d49c136","Type":"ContainerStarted","Data":"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68"} Apr 23 13:42:43.705709 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:43.705401 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:43.722047 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:43.721889 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" podStartSLOduration=2.348068709 podStartE2EDuration="5.721875845s" podCreationTimestamp="2026-04-23 13:42:38 +0000 UTC" firstStartedPulling="2026-04-23 13:42:39.643037647 +0000 UTC m=+662.630278504" lastFinishedPulling="2026-04-23 13:42:43.016844784 +0000 UTC m=+666.004085640" observedRunningTime="2026-04-23 13:42:43.721752576 +0000 UTC m=+666.708993455" watchObservedRunningTime="2026-04-23 13:42:43.721875845 +0000 UTC m=+666.709116727" Apr 23 13:42:44.707937 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:44.707886 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:44.709400 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:44.709383 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:42:51.716774 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:42:51.716746 2582 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:44:23.609224 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.609191 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-wrwkg_02fc9ade-c9a8-4107-ae3a-c7241d49c136/kserve-container/0.log" Apr 23 13:44:23.825387 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.825344 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl"] Apr 23 13:44:23.828196 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.828178 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.830815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.830797 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 23 13:44:23.830896 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.830838 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 23 13:44:23.839457 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.839437 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl"] Apr 23 13:44:23.860626 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.860565 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86311db1-2e63-4dad-835d-3df7ea5e0972-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.860812 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.860785 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hklx\" (UniqueName: \"kubernetes.io/projected/86311db1-2e63-4dad-835d-3df7ea5e0972-kube-api-access-6hklx\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.860950 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.860837 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86311db1-2e63-4dad-835d-3df7ea5e0972-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.860950 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.860911 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86311db1-2e63-4dad-835d-3df7ea5e0972-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.878583 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.878559 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg"] Apr 23 
13:44:23.878879 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.878853 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kserve-container" containerID="cri-o://4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5" gracePeriod=30 Apr 23 13:44:23.879019 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.878872 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kube-rbac-proxy" containerID="cri-o://54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68" gracePeriod=30 Apr 23 13:44:23.961443 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.961415 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86311db1-2e63-4dad-835d-3df7ea5e0972-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.961549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.961451 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hklx\" (UniqueName: \"kubernetes.io/projected/86311db1-2e63-4dad-835d-3df7ea5e0972-kube-api-access-6hklx\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.961549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.961472 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86311db1-2e63-4dad-835d-3df7ea5e0972-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.961549 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.961502 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86311db1-2e63-4dad-835d-3df7ea5e0972-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.961824 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.961808 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86311db1-2e63-4dad-835d-3df7ea5e0972-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.962119 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.962099 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86311db1-2e63-4dad-835d-3df7ea5e0972-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 
13:44:23.963874 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.963857 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86311db1-2e63-4dad-835d-3df7ea5e0972-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:23.969667 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:23.969648 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hklx\" (UniqueName: \"kubernetes.io/projected/86311db1-2e63-4dad-835d-3df7ea5e0972-kube-api-access-6hklx\") pod \"isvc-lightgbm-predictor-bdf964bd-9rkxl\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:24.109940 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.109903 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:44:24.137362 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.137337 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:24.163173 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.163150 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls\") pod \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " Apr 23 13:44:24.163275 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.163201 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsxbq\" (UniqueName: \"kubernetes.io/projected/02fc9ade-c9a8-4107-ae3a-c7241d49c136-kube-api-access-rsxbq\") pod \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " Apr 23 13:44:24.163376 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.163356 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02fc9ade-c9a8-4107-ae3a-c7241d49c136-message-dumper-kube-rbac-proxy-sar-config\") pod \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\" (UID: \"02fc9ade-c9a8-4107-ae3a-c7241d49c136\") " Apr 23 13:44:24.163785 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.163754 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02fc9ade-c9a8-4107-ae3a-c7241d49c136-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "02fc9ade-c9a8-4107-ae3a-c7241d49c136" (UID: "02fc9ade-c9a8-4107-ae3a-c7241d49c136"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:24.165359 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.165339 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "02fc9ade-c9a8-4107-ae3a-c7241d49c136" (UID: "02fc9ade-c9a8-4107-ae3a-c7241d49c136"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:24.165427 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.165380 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02fc9ade-c9a8-4107-ae3a-c7241d49c136-kube-api-access-rsxbq" (OuterVolumeSpecName: "kube-api-access-rsxbq") pod "02fc9ade-c9a8-4107-ae3a-c7241d49c136" (UID: "02fc9ade-c9a8-4107-ae3a-c7241d49c136"). InnerVolumeSpecName "kube-api-access-rsxbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:44:24.254082 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.254055 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl"] Apr 23 13:44:24.256890 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:44:24.256853 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86311db1_2e63_4dad_835d_3df7ea5e0972.slice/crio-20898ec5ecbc2a9852ddcca377ca4f7a0373074f91c7a3bb045765cbde4d37fe WatchSource:0}: Error finding container 20898ec5ecbc2a9852ddcca377ca4f7a0373074f91c7a3bb045765cbde4d37fe: Status 404 returned error can't find the container with id 20898ec5ecbc2a9852ddcca377ca4f7a0373074f91c7a3bb045765cbde4d37fe Apr 23 13:44:24.264361 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.264340 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rsxbq\" (UniqueName: \"kubernetes.io/projected/02fc9ade-c9a8-4107-ae3a-c7241d49c136-kube-api-access-rsxbq\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:44:24.264430 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.264362 2582 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02fc9ade-c9a8-4107-ae3a-c7241d49c136-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:44:24.264430 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.264372 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02fc9ade-c9a8-4107-ae3a-c7241d49c136-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:44:24.975557 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.975514 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerStarted","Data":"20898ec5ecbc2a9852ddcca377ca4f7a0373074f91c7a3bb045765cbde4d37fe"} Apr 23 13:44:24.977202 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977146 2582 generic.go:358] "Generic (PLEG): container finished" podID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerID="54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68" exitCode=2 Apr 23 13:44:24.977202 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977173 2582 generic.go:358] "Generic (PLEG): container finished" podID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerID="4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5" exitCode=2 Apr 23 13:44:24.977390 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977207 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" event={"ID":"02fc9ade-c9a8-4107-ae3a-c7241d49c136","Type":"ContainerDied","Data":"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68"} Apr 23 13:44:24.977390 
ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977234 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" event={"ID":"02fc9ade-c9a8-4107-ae3a-c7241d49c136","Type":"ContainerDied","Data":"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5"} Apr 23 13:44:24.977390 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977248 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" event={"ID":"02fc9ade-c9a8-4107-ae3a-c7241d49c136","Type":"ContainerDied","Data":"ca53a77531823245c0474f30fe8d28f7ca495467960f8e0e06125c56dad44e44"} Apr 23 13:44:24.977390 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977248 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg" Apr 23 13:44:24.977390 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.977266 2582 scope.go:117] "RemoveContainer" containerID="54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68" Apr 23 13:44:24.987872 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.987687 2582 scope.go:117] "RemoveContainer" containerID="4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5" Apr 23 13:44:24.997060 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997038 2582 scope.go:117] "RemoveContainer" containerID="54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68" Apr 23 13:44:24.997292 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:44:24.997271 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68\": container with ID starting with 54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68 not found: ID does not exist" containerID="54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68" Apr 23 13:44:24.997378 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997302 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68"} err="failed to get container status \"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68\": rpc error: code = NotFound desc = could not find container \"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68\": container with ID starting with 54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68 not found: ID does not exist" Apr 23 13:44:24.997378 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997326 2582 scope.go:117] "RemoveContainer" containerID="4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5" Apr 23 13:44:24.997556 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:44:24.997533 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5\": container with ID starting with 4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5 not found: ID does not exist" containerID="4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5" Apr 23 13:44:24.997625 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997563 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5"} err="failed to get container 
status \"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5\": rpc error: code = NotFound desc = could not find container \"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5\": container with ID starting with 4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5 not found: ID does not exist" Apr 23 13:44:24.997625 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997582 2582 scope.go:117] "RemoveContainer" containerID="54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68" Apr 23 13:44:24.997854 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997835 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68"} err="failed to get container status \"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68\": rpc error: code = NotFound desc = could not find container \"54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68\": container with ID starting with 54b1b3818c509fae52f726019d0f218700dde0005ccc20bbb873cbedbf46ae68 not found: ID does not exist" Apr 23 13:44:24.997854 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.997857 2582 scope.go:117] "RemoveContainer" containerID="4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5" Apr 23 13:44:24.998231 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:24.998203 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5"} err="failed to get container status \"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5\": rpc error: code = NotFound desc = could not find container \"4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5\": container with ID starting with 4d51cc1e98baa859c2ee505b47b1e58e3ea71bf007aaf21f55cf2ce9b3f340a5 not found: ID does not exist" Apr 23 13:44:25.001383 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:25.001360 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg"] Apr 23 13:44:25.005391 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:25.005352 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wrwkg"] Apr 23 13:44:25.672910 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:25.672877 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" path="/var/lib/kubelet/pods/02fc9ade-c9a8-4107-ae3a-c7241d49c136/volumes" Apr 23 13:44:27.989089 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:27.989057 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerStarted","Data":"029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389"} Apr 23 13:44:32.003726 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:32.003690 2582 generic.go:358] "Generic (PLEG): container finished" podID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerID="029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389" exitCode=0 Apr 23 13:44:32.004099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:32.003768 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" 
event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerDied","Data":"029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389"} Apr 23 13:44:39.026595 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:39.026563 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerStarted","Data":"1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c"} Apr 23 13:44:39.026965 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:39.026604 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerStarted","Data":"5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532"} Apr 23 13:44:39.026965 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:39.026810 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:39.044460 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:39.044417 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podStartSLOduration=2.060377499 podStartE2EDuration="16.044404919s" podCreationTimestamp="2026-04-23 13:44:23 +0000 UTC" firstStartedPulling="2026-04-23 13:44:24.260712727 +0000 UTC m=+767.247953586" lastFinishedPulling="2026-04-23 13:44:38.244740136 +0000 UTC m=+781.231981006" observedRunningTime="2026-04-23 13:44:39.043004621 +0000 UTC m=+782.030245501" watchObservedRunningTime="2026-04-23 13:44:39.044404919 +0000 UTC m=+782.031645796" Apr 23 13:44:40.029528 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:40.029491 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:40.030622 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:40.030589 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:41.033139 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:41.033093 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:46.038663 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:46.038630 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:44:46.039227 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:46.039199 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:56.039527 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:44:56.039451 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:45:06.040123 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:45:06.040085 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:45:16.039901 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:45:16.039856 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:45:26.039339 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:45:26.039298 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:45:36.039224 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:45:36.039178 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:45:46.039402 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:45:46.039358 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:45:56.039852 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:45:56.039817 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:46:03.989193 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:03.989158 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl"] Apr 23 13:46:03.989582 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:03.989458 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" containerID="cri-o://5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532" gracePeriod=30 Apr 23 13:46:03.989582 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:03.989511 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kube-rbac-proxy" containerID="cri-o://1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c" gracePeriod=30 Apr 23 13:46:04.102664 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.102634 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275"] Apr 23 13:46:04.103033 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.103019 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kube-rbac-proxy" Apr 23 13:46:04.103111 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.103037 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kube-rbac-proxy" Apr 23 13:46:04.103111 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.103065 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kserve-container" Apr 23 13:46:04.103111 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.103074 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kserve-container" Apr 23 13:46:04.103200 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.103169 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kube-rbac-proxy" Apr 23 13:46:04.103200 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.103181 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="02fc9ade-c9a8-4107-ae3a-c7241d49c136" containerName="kserve-container" Apr 23 13:46:04.106438 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.106414 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.108666 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.108645 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 23 13:46:04.108837 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.108822 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 23 13:46:04.115422 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.115402 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275"] Apr 23 13:46:04.204305 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.204279 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnz2\" (UniqueName: \"kubernetes.io/projected/9167d905-825c-43c3-ae42-63108d9c496b-kube-api-access-8cnz2\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.204424 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.204317 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9167d905-825c-43c3-ae42-63108d9c496b-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.204424 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.204346 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9167d905-825c-43c3-ae42-63108d9c496b-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 
13:46:04.204424 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.204376 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9167d905-825c-43c3-ae42-63108d9c496b-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.260018 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.259950 2582 generic.go:358] "Generic (PLEG): container finished" podID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerID="1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c" exitCode=2 Apr 23 13:46:04.260130 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.260025 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerDied","Data":"1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c"} Apr 23 13:46:04.305456 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.305427 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnz2\" (UniqueName: \"kubernetes.io/projected/9167d905-825c-43c3-ae42-63108d9c496b-kube-api-access-8cnz2\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.305556 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.305468 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9167d905-825c-43c3-ae42-63108d9c496b-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.305556 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.305487 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9167d905-825c-43c3-ae42-63108d9c496b-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.305556 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.305512 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9167d905-825c-43c3-ae42-63108d9c496b-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.305840 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.305821 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9167d905-825c-43c3-ae42-63108d9c496b-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 
13:46:04.306146 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.306129 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9167d905-825c-43c3-ae42-63108d9c496b-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.307989 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.307975 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9167d905-825c-43c3-ae42-63108d9c496b-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.313517 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.313497 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnz2\" (UniqueName: \"kubernetes.io/projected/9167d905-825c-43c3-ae42-63108d9c496b-kube-api-access-8cnz2\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-nc275\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.417669 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.417640 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:04.536056 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:04.536032 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275"] Apr 23 13:46:04.538537 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:46:04.538511 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9167d905_825c_43c3_ae42_63108d9c496b.slice/crio-1d980012c0b27f204150d47db483f25486516288df64bf9f7bfe36f2291471cd WatchSource:0}: Error finding container 1d980012c0b27f204150d47db483f25486516288df64bf9f7bfe36f2291471cd: Status 404 returned error can't find the container with id 1d980012c0b27f204150d47db483f25486516288df64bf9f7bfe36f2291471cd Apr 23 13:46:05.264337 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:05.264302 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerStarted","Data":"dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693"} Apr 23 13:46:05.264337 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:05.264337 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerStarted","Data":"1d980012c0b27f204150d47db483f25486516288df64bf9f7bfe36f2291471cd"} Apr 23 13:46:06.033893 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:06.033857 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 
13:46:06.039840 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:06.039810 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:46:08.276530 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:08.276444 2582 generic.go:358] "Generic (PLEG): container finished" podID="9167d905-825c-43c3-ae42-63108d9c496b" containerID="dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693" exitCode=0 Apr 23 13:46:08.276530 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:08.276513 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerDied","Data":"dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693"} Apr 23 13:46:09.233872 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.233850 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:46:09.280552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.280521 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerStarted","Data":"96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36"} Apr 23 13:46:09.280939 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.280559 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerStarted","Data":"e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f"} Apr 23 13:46:09.280939 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.280848 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:09.282141 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.282108 2582 generic.go:358] "Generic (PLEG): container finished" podID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerID="5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532" exitCode=0 Apr 23 13:46:09.282257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.282163 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" Apr 23 13:46:09.282257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.282176 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerDied","Data":"5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532"} Apr 23 13:46:09.282257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.282214 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl" event={"ID":"86311db1-2e63-4dad-835d-3df7ea5e0972","Type":"ContainerDied","Data":"20898ec5ecbc2a9852ddcca377ca4f7a0373074f91c7a3bb045765cbde4d37fe"} Apr 23 13:46:09.282257 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.282235 2582 scope.go:117] "RemoveContainer" containerID="1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c" Apr 23 13:46:09.289577 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.289560 2582 scope.go:117] "RemoveContainer" containerID="5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532" Apr 23 13:46:09.296528 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.296510 2582 scope.go:117] "RemoveContainer" containerID="029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389" Apr 23 13:46:09.303719 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.303682 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podStartSLOduration=5.303670178 podStartE2EDuration="5.303670178s" podCreationTimestamp="2026-04-23 13:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:46:09.301450207 +0000 UTC m=+872.288691084" watchObservedRunningTime="2026-04-23 13:46:09.303670178 +0000 UTC m=+872.290911059" Apr 23 13:46:09.303964 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.303952 2582 scope.go:117] "RemoveContainer" containerID="1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c" Apr 23 13:46:09.304220 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:46:09.304188 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c\": container with ID starting with 1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c not found: ID does not exist" containerID="1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c" Apr 23 13:46:09.304270 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.304229 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c"} err="failed to get container status \"1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c\": rpc error: code = NotFound desc = could not find container \"1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c\": container with ID starting with 1a5b4bc6ca18de3a65fa04e832d58519c790c8923e49788a58677fe860e92b8c not found: ID does not exist" Apr 23 13:46:09.304270 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.304246 2582 scope.go:117] "RemoveContainer" containerID="5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532" Apr 23 13:46:09.304455 ip-10-0-139-40 kubenswrapper[2582]: E0423 
13:46:09.304436 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532\": container with ID starting with 5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532 not found: ID does not exist" containerID="5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532" Apr 23 13:46:09.304519 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.304463 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532"} err="failed to get container status \"5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532\": rpc error: code = NotFound desc = could not find container \"5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532\": container with ID starting with 5eaf98fb555c5a8013cfa16fe887f421fb367397208400985d14eb07045a1532 not found: ID does not exist" Apr 23 13:46:09.304519 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.304486 2582 scope.go:117] "RemoveContainer" containerID="029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389" Apr 23 13:46:09.304711 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:46:09.304691 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389\": container with ID starting with 029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389 not found: ID does not exist" containerID="029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389" Apr 23 13:46:09.304746 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.304718 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389"} err="failed to get container status \"029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389\": rpc error: code = NotFound desc = could not find container \"029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389\": container with ID starting with 029406565f70838fe66d1f3fee61e1c6da92f4ac6d0fd49d2c9487a76a803389 not found: ID does not exist" Apr 23 13:46:09.343555 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.343536 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86311db1-2e63-4dad-835d-3df7ea5e0972-kserve-provision-location\") pod \"86311db1-2e63-4dad-835d-3df7ea5e0972\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " Apr 23 13:46:09.343673 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.343578 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86311db1-2e63-4dad-835d-3df7ea5e0972-proxy-tls\") pod \"86311db1-2e63-4dad-835d-3df7ea5e0972\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " Apr 23 13:46:09.343673 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.343642 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86311db1-2e63-4dad-835d-3df7ea5e0972-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"86311db1-2e63-4dad-835d-3df7ea5e0972\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " Apr 23 13:46:09.343784 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:46:09.343719 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hklx\" (UniqueName: \"kubernetes.io/projected/86311db1-2e63-4dad-835d-3df7ea5e0972-kube-api-access-6hklx\") pod \"86311db1-2e63-4dad-835d-3df7ea5e0972\" (UID: \"86311db1-2e63-4dad-835d-3df7ea5e0972\") " Apr 23 13:46:09.344012 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.343899 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86311db1-2e63-4dad-835d-3df7ea5e0972-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86311db1-2e63-4dad-835d-3df7ea5e0972" (UID: "86311db1-2e63-4dad-835d-3df7ea5e0972"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:09.344012 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.343977 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86311db1-2e63-4dad-835d-3df7ea5e0972-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:46:09.344165 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.344017 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86311db1-2e63-4dad-835d-3df7ea5e0972-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "86311db1-2e63-4dad-835d-3df7ea5e0972" (UID: "86311db1-2e63-4dad-835d-3df7ea5e0972"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:46:09.345681 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.345657 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86311db1-2e63-4dad-835d-3df7ea5e0972-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "86311db1-2e63-4dad-835d-3df7ea5e0972" (UID: "86311db1-2e63-4dad-835d-3df7ea5e0972"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:46:09.345843 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.345820 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86311db1-2e63-4dad-835d-3df7ea5e0972-kube-api-access-6hklx" (OuterVolumeSpecName: "kube-api-access-6hklx") pod "86311db1-2e63-4dad-835d-3df7ea5e0972" (UID: "86311db1-2e63-4dad-835d-3df7ea5e0972"). InnerVolumeSpecName "kube-api-access-6hklx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:46:09.444690 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.444668 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hklx\" (UniqueName: \"kubernetes.io/projected/86311db1-2e63-4dad-835d-3df7ea5e0972-kube-api-access-6hklx\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:46:09.444690 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.444688 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86311db1-2e63-4dad-835d-3df7ea5e0972-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:46:09.444825 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.444701 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86311db1-2e63-4dad-835d-3df7ea5e0972-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:46:09.602420 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.602393 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl"] Apr 23 13:46:09.607172 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.607138 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-9rkxl"] Apr 23 13:46:09.670115 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:09.670088 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" path="/var/lib/kubelet/pods/86311db1-2e63-4dad-835d-3df7ea5e0972/volumes" Apr 23 13:46:10.285627 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:10.285590 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:10.286714 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:10.286688 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:46:11.287883 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:11.287842 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:46:16.291964 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:16.291909 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:46:16.292447 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:16.292423 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:46:26.292888 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:26.292849 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" 
podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:46:36.293092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:36.293054 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:46:37.573825 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:37.573799 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:46:37.574615 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:37.574595 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:46:46.293005 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:46.292965 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:46:56.292855 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:46:56.292815 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:47:06.292656 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:06.292619 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:47:16.292582 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:16.292542 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:47:25.670125 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:25.670099 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:47:34.577629 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.577598 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275"] Apr 23 13:47:34.580169 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.578022 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" containerID="cri-o://e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f" gracePeriod=30 Apr 23 13:47:34.580169 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.578058 2582 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kube-rbac-proxy" containerID="cri-o://96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36" gracePeriod=30 Apr 23 13:47:34.689423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689382 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk"] Apr 23 13:47:34.689682 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689667 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" Apr 23 13:47:34.689682 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689681 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" Apr 23 13:47:34.689815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689698 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kube-rbac-proxy" Apr 23 13:47:34.689815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689704 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kube-rbac-proxy" Apr 23 13:47:34.689815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689712 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="storage-initializer" Apr 23 13:47:34.689815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689717 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="storage-initializer" Apr 23 13:47:34.689815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689763 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kserve-container" Apr 23 13:47:34.689815 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.689773 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="86311db1-2e63-4dad-835d-3df7ea5e0972" containerName="kube-rbac-proxy" Apr 23 13:47:34.692900 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.692878 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.695223 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.695197 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 23 13:47:34.695439 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.695425 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 13:47:34.701605 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.701584 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk"] Apr 23 13:47:34.775340 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.775302 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.775340 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.775347 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/000fa011-c531-45ba-8110-09ec4f8d9a78-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.775552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.775366 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff5xf\" (UniqueName: \"kubernetes.io/projected/000fa011-c531-45ba-8110-09ec4f8d9a78-kube-api-access-ff5xf\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.775552 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.775461 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/000fa011-c531-45ba-8110-09ec4f8d9a78-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.876836 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.876741 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/000fa011-c531-45ba-8110-09ec4f8d9a78-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.876836 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.876811 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls\") pod 
\"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.876836 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.876832 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/000fa011-c531-45ba-8110-09ec4f8d9a78-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.877127 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.876850 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff5xf\" (UniqueName: \"kubernetes.io/projected/000fa011-c531-45ba-8110-09ec4f8d9a78-kube-api-access-ff5xf\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.877188 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:47:34.877134 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-serving-cert: secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 23 13:47:34.877241 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:47:34.877228 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls podName:000fa011-c531-45ba-8110-09ec4f8d9a78 nodeName:}" failed. No retries permitted until 2026-04-23 13:47:35.377204299 +0000 UTC m=+958.364445161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls") pod "isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" (UID: "000fa011-c531-45ba-8110-09ec4f8d9a78") : secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 23 13:47:34.877398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.877374 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/000fa011-c531-45ba-8110-09ec4f8d9a78-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.877673 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.877653 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/000fa011-c531-45ba-8110-09ec4f8d9a78-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:34.886317 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:34.886290 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff5xf\" (UniqueName: \"kubernetes.io/projected/000fa011-c531-45ba-8110-09ec4f8d9a78-kube-api-access-ff5xf\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:35.379635 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.379590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:35.382167 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.382137 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:35.515695 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.515660 2582 generic.go:358] "Generic (PLEG): container finished" podID="9167d905-825c-43c3-ae42-63108d9c496b" containerID="96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36" exitCode=2 Apr 23 13:47:35.515855 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.515700 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerDied","Data":"96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36"} Apr 23 13:47:35.604569 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.604531 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:47:35.667545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.667507 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:47:35.729347 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:35.729317 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk"] Apr 23 13:47:35.731577 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:47:35.731546 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000fa011_c531_45ba_8110_09ec4f8d9a78.slice/crio-ae23d11bd2b37f93603046c557a351cc9199c7310489cbfea233eb1413f79aa4 WatchSource:0}: Error finding container ae23d11bd2b37f93603046c557a351cc9199c7310489cbfea233eb1413f79aa4: Status 404 returned error can't find the container with id ae23d11bd2b37f93603046c557a351cc9199c7310489cbfea233eb1413f79aa4 Apr 23 13:47:36.289108 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:36.289061 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 23 13:47:36.520216 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:36.520178 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerStarted","Data":"63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43"} Apr 23 13:47:36.520216 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:36.520219 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerStarted","Data":"ae23d11bd2b37f93603046c557a351cc9199c7310489cbfea233eb1413f79aa4"} Apr 23 13:47:39.529887 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:39.529848 2582 generic.go:358] "Generic (PLEG): container finished" podID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerID="63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43" exitCode=0 Apr 23 13:47:39.530357 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:39.529883 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerDied","Data":"63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43"} Apr 23 13:47:40.377078 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.377054 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:47:40.521770 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.521678 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnz2\" (UniqueName: \"kubernetes.io/projected/9167d905-825c-43c3-ae42-63108d9c496b-kube-api-access-8cnz2\") pod \"9167d905-825c-43c3-ae42-63108d9c496b\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " Apr 23 13:47:40.521770 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.521757 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9167d905-825c-43c3-ae42-63108d9c496b-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"9167d905-825c-43c3-ae42-63108d9c496b\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " Apr 23 13:47:40.522079 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.521798 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9167d905-825c-43c3-ae42-63108d9c496b-proxy-tls\") pod \"9167d905-825c-43c3-ae42-63108d9c496b\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " Apr 23 13:47:40.522079 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.521976 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9167d905-825c-43c3-ae42-63108d9c496b-kserve-provision-location\") pod \"9167d905-825c-43c3-ae42-63108d9c496b\" (UID: \"9167d905-825c-43c3-ae42-63108d9c496b\") " Apr 23 13:47:40.523024 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.522981 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9167d905-825c-43c3-ae42-63108d9c496b-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "9167d905-825c-43c3-ae42-63108d9c496b" (UID: "9167d905-825c-43c3-ae42-63108d9c496b"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:47:40.523317 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.523294 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9167d905-825c-43c3-ae42-63108d9c496b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9167d905-825c-43c3-ae42-63108d9c496b" (UID: "9167d905-825c-43c3-ae42-63108d9c496b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:40.526242 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.526121 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9167d905-825c-43c3-ae42-63108d9c496b-kube-api-access-8cnz2" (OuterVolumeSpecName: "kube-api-access-8cnz2") pod "9167d905-825c-43c3-ae42-63108d9c496b" (UID: "9167d905-825c-43c3-ae42-63108d9c496b"). InnerVolumeSpecName "kube-api-access-8cnz2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:47:40.531329 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.531297 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9167d905-825c-43c3-ae42-63108d9c496b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9167d905-825c-43c3-ae42-63108d9c496b" (UID: "9167d905-825c-43c3-ae42-63108d9c496b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:47:40.543234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.538777 2582 generic.go:358] "Generic (PLEG): container finished" podID="9167d905-825c-43c3-ae42-63108d9c496b" containerID="e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f" exitCode=0 Apr 23 13:47:40.543234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.538833 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerDied","Data":"e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f"} Apr 23 13:47:40.543234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.538866 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" event={"ID":"9167d905-825c-43c3-ae42-63108d9c496b","Type":"ContainerDied","Data":"1d980012c0b27f204150d47db483f25486516288df64bf9f7bfe36f2291471cd"} Apr 23 13:47:40.543234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.538887 2582 scope.go:117] "RemoveContainer" containerID="96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36" Apr 23 13:47:40.543234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.539111 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275" Apr 23 13:47:40.558900 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.558804 2582 scope.go:117] "RemoveContainer" containerID="e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f" Apr 23 13:47:40.572962 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.570479 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275"] Apr 23 13:47:40.572962 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.572860 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-nc275"] Apr 23 13:47:40.578375 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.578348 2582 scope.go:117] "RemoveContainer" containerID="dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693" Apr 23 13:47:40.598743 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.598716 2582 scope.go:117] "RemoveContainer" containerID="96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36" Apr 23 13:47:40.599162 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:47:40.599123 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36\": container with ID starting with 96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36 not found: ID does not exist" containerID="96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36" Apr 23 13:47:40.599253 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.599169 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36"} err="failed to get container status \"96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36\": rpc error: code = NotFound desc = could not find container \"96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36\": container with ID starting with 96262ade8e65c71c04a5a254c986e1de3c7bb2315a8a89d22ae7239710ca7a36 not found: ID does not exist" Apr 23 13:47:40.599253 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.599196 2582 scope.go:117] "RemoveContainer" containerID="e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f" Apr 23 13:47:40.599728 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:47:40.599693 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f\": container with ID starting with e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f not found: ID does not exist" containerID="e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f" Apr 23 13:47:40.599817 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.599728 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f"} err="failed to get container status \"e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f\": rpc error: code = NotFound desc = could not find container \"e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f\": container with ID starting with e22d5232ebaff6d7530056466e3c1ef59a19e16b07b4529fcb743163a1c1af6f not found: ID does not exist" Apr 23 13:47:40.599817 ip-10-0-139-40 
kubenswrapper[2582]: I0423 13:47:40.599751 2582 scope.go:117] "RemoveContainer" containerID="dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693" Apr 23 13:47:40.600007 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:47:40.599981 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693\": container with ID starting with dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693 not found: ID does not exist" containerID="dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693" Apr 23 13:47:40.600092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.600016 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693"} err="failed to get container status \"dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693\": rpc error: code = NotFound desc = could not find container \"dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693\": container with ID starting with dd2e706632bec3cbed5127ed732705b6b5c10fda1b0a216144cd4329501e3693 not found: ID does not exist" Apr 23 13:47:40.623074 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.622977 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9167d905-825c-43c3-ae42-63108d9c496b-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:47:40.623074 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.623018 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8cnz2\" (UniqueName: \"kubernetes.io/projected/9167d905-825c-43c3-ae42-63108d9c496b-kube-api-access-8cnz2\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:47:40.623074 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.623039 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9167d905-825c-43c3-ae42-63108d9c496b-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:47:40.623074 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:40.623054 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9167d905-825c-43c3-ae42-63108d9c496b-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:47:41.672302 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:47:41.671775 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9167d905-825c-43c3-ae42-63108d9c496b" path="/var/lib/kubelet/pods/9167d905-825c-43c3-ae42-63108d9c496b/volumes" Apr 23 13:49:54.279833 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:49:54.279802 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:49:54.971462 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:49:54.971421 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerStarted","Data":"c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59"} Apr 23 13:49:54.971462 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:49:54.971463 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerStarted","Data":"c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18"} Apr 23 13:49:54.971688 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:49:54.971557 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:49:54.971688 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:49:54.971595 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:49:55.000875 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:49:55.000818 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" podStartSLOduration=6.340940062 podStartE2EDuration="2m21.000801737s" podCreationTimestamp="2026-04-23 13:47:34 +0000 UTC" firstStartedPulling="2026-04-23 13:47:39.531158205 +0000 UTC m=+962.518399067" lastFinishedPulling="2026-04-23 13:49:54.191019872 +0000 UTC m=+1097.178260742" observedRunningTime="2026-04-23 13:49:54.999617599 +0000 UTC m=+1097.986858477" watchObservedRunningTime="2026-04-23 13:49:55.000801737 +0000 UTC m=+1097.988042616" Apr 23 13:50:00.981231 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:00.981199 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:50:30.984554 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:30.984521 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:50:34.883762 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:34.883730 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk"] Apr 23 13:50:34.884156 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:34.884038 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kserve-container" containerID="cri-o://c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18" gracePeriod=30 Apr 23 13:50:34.884156 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:34.884127 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kube-rbac-proxy" containerID="cri-o://c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59" gracePeriod=30 Apr 23 13:50:35.001605 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.001575 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz"] Apr 23 13:50:35.001989 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.001965 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="storage-initializer" Apr 23 13:50:35.001989 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.001988 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="storage-initializer" Apr 23 
13:50:35.002187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.002009 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" Apr 23 13:50:35.002187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.002017 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" Apr 23 13:50:35.002187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.002056 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kube-rbac-proxy" Apr 23 13:50:35.002187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.002062 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kube-rbac-proxy" Apr 23 13:50:35.002187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.002112 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kserve-container" Apr 23 13:50:35.002187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.002119 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="9167d905-825c-43c3-ae42-63108d9c496b" containerName="kube-rbac-proxy" Apr 23 13:50:35.005439 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.005421 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.007847 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.007823 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 23 13:50:35.007847 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.007840 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 13:50:35.013957 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.013911 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz"] Apr 23 13:50:35.055933 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.055893 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvwg\" (UniqueName: \"kubernetes.io/projected/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kube-api-access-ttvwg\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.056139 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.055960 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.056945 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.056445 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls\") pod 
\"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.062056 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.062006 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3f6f9dab-3b1d-4e2e-b785-5d8981853772-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.087943 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.087883 2582 generic.go:358] "Generic (PLEG): container finished" podID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerID="c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59" exitCode=2 Apr 23 13:50:35.088069 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.087957 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerDied","Data":"c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59"} Apr 23 13:50:35.163509 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.163423 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.163509 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.163478 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3f6f9dab-3b1d-4e2e-b785-5d8981853772-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.163509 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.163504 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvwg\" (UniqueName: \"kubernetes.io/projected/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kube-api-access-ttvwg\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.163754 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.163537 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.163754 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:50:35.163610 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 23 
13:50:35.163754 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:50:35.163700 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls podName:3f6f9dab-3b1d-4e2e-b785-5d8981853772 nodeName:}" failed. No retries permitted until 2026-04-23 13:50:35.663677861 +0000 UTC m=+1138.650918734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" (UID: "3f6f9dab-3b1d-4e2e-b785-5d8981853772") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 23 13:50:35.164019 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.163998 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.164291 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.164270 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3f6f9dab-3b1d-4e2e-b785-5d8981853772-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.174237 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.174212 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvwg\" (UniqueName: \"kubernetes.io/projected/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kube-api-access-ttvwg\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.666805 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.666763 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.669537 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.669506 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:35.917197 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:35.917123 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:36.044683 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.044658 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:50:36.054458 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.054426 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz"] Apr 23 13:50:36.057632 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:50:36.057594 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6f9dab_3b1d_4e2e_b785_5d8981853772.slice/crio-a81c25c0d06eb42866aee26284baa3a7fc7658a1ed50dd2072a4d5fec55c46e1 WatchSource:0}: Error finding container a81c25c0d06eb42866aee26284baa3a7fc7658a1ed50dd2072a4d5fec55c46e1: Status 404 returned error can't find the container with id a81c25c0d06eb42866aee26284baa3a7fc7658a1ed50dd2072a4d5fec55c46e1 Apr 23 13:50:36.070475 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.070445 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls\") pod \"000fa011-c531-45ba-8110-09ec4f8d9a78\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " Apr 23 13:50:36.070597 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.070512 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/000fa011-c531-45ba-8110-09ec4f8d9a78-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"000fa011-c531-45ba-8110-09ec4f8d9a78\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " Apr 23 13:50:36.070597 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.070559 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/000fa011-c531-45ba-8110-09ec4f8d9a78-kserve-provision-location\") pod \"000fa011-c531-45ba-8110-09ec4f8d9a78\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " Apr 23 13:50:36.070699 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.070622 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff5xf\" (UniqueName: \"kubernetes.io/projected/000fa011-c531-45ba-8110-09ec4f8d9a78-kube-api-access-ff5xf\") pod \"000fa011-c531-45ba-8110-09ec4f8d9a78\" (UID: \"000fa011-c531-45ba-8110-09ec4f8d9a78\") " Apr 23 13:50:36.070965 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.070940 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/000fa011-c531-45ba-8110-09ec4f8d9a78-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "000fa011-c531-45ba-8110-09ec4f8d9a78" (UID: "000fa011-c531-45ba-8110-09ec4f8d9a78"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:50:36.071083 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.070909 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000fa011-c531-45ba-8110-09ec4f8d9a78-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "000fa011-c531-45ba-8110-09ec4f8d9a78" (UID: "000fa011-c531-45ba-8110-09ec4f8d9a78"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:50:36.072946 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.072904 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000fa011-c531-45ba-8110-09ec4f8d9a78-kube-api-access-ff5xf" (OuterVolumeSpecName: "kube-api-access-ff5xf") pod "000fa011-c531-45ba-8110-09ec4f8d9a78" (UID: "000fa011-c531-45ba-8110-09ec4f8d9a78"). InnerVolumeSpecName "kube-api-access-ff5xf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:50:36.073208 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.073187 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "000fa011-c531-45ba-8110-09ec4f8d9a78" (UID: "000fa011-c531-45ba-8110-09ec4f8d9a78"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:50:36.092807 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.092776 2582 generic.go:358] "Generic (PLEG): container finished" podID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerID="c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18" exitCode=0 Apr 23 13:50:36.093000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.092855 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" Apr 23 13:50:36.093000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.092884 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerDied","Data":"c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18"} Apr 23 13:50:36.093000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.092909 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" event={"ID":"000fa011-c531-45ba-8110-09ec4f8d9a78","Type":"ContainerDied","Data":"ae23d11bd2b37f93603046c557a351cc9199c7310489cbfea233eb1413f79aa4"} Apr 23 13:50:36.093000 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.092947 2582 scope.go:117] "RemoveContainer" containerID="c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59" Apr 23 13:50:36.094012 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.093995 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerStarted","Data":"a81c25c0d06eb42866aee26284baa3a7fc7658a1ed50dd2072a4d5fec55c46e1"} Apr 23 13:50:36.102058 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.102036 2582 scope.go:117] "RemoveContainer" containerID="c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18" Apr 23 13:50:36.113717 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.113684 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk"] Apr 23 13:50:36.117814 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.117790 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk"] Apr 23 13:50:36.121149 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.121132 2582 scope.go:117] "RemoveContainer" 
containerID="63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43" Apr 23 13:50:36.128508 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.128489 2582 scope.go:117] "RemoveContainer" containerID="c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59" Apr 23 13:50:36.128812 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:50:36.128788 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59\": container with ID starting with c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59 not found: ID does not exist" containerID="c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59" Apr 23 13:50:36.128884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.128820 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59"} err="failed to get container status \"c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59\": rpc error: code = NotFound desc = could not find container \"c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59\": container with ID starting with c785309052d305df804cc6cfda752ea3a6db8ed6ad9f325f19be1061a417aa59 not found: ID does not exist" Apr 23 13:50:36.128884 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.128838 2582 scope.go:117] "RemoveContainer" containerID="c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18" Apr 23 13:50:36.129195 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:50:36.129100 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18\": container with ID starting with c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18 not found: ID does not exist" containerID="c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18" Apr 23 13:50:36.129195 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.129131 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18"} err="failed to get container status \"c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18\": rpc error: code = NotFound desc = could not find container \"c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18\": container with ID starting with c8e82a677366feb46de5592557c65749a219a5408c72a2fdf5aa8c25fa241b18 not found: ID does not exist" Apr 23 13:50:36.129195 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.129148 2582 scope.go:117] "RemoveContainer" containerID="63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43" Apr 23 13:50:36.129402 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:50:36.129381 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43\": container with ID starting with 63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43 not found: ID does not exist" containerID="63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43" Apr 23 13:50:36.129463 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.129405 2582 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43"} err="failed to get container status \"63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43\": rpc error: code = NotFound desc = could not find container \"63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43\": container with ID starting with 63b46b0a0716c925f846957698187a49dd71045a710d01c2bcd9ee1c89f67a43 not found: ID does not exist" Apr 23 13:50:36.172118 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.172040 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/000fa011-c531-45ba-8110-09ec4f8d9a78-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:50:36.172118 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.172069 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/000fa011-c531-45ba-8110-09ec4f8d9a78-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:50:36.172118 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.172080 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/000fa011-c531-45ba-8110-09ec4f8d9a78-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:50:36.172118 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.172092 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ff5xf\" (UniqueName: \"kubernetes.io/projected/000fa011-c531-45ba-8110-09ec4f8d9a78-kube-api-access-ff5xf\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:50:36.977791 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:36.977747 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-tdbmk" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 23 13:50:37.099145 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:37.099105 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerStarted","Data":"8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6"} Apr 23 13:50:37.669899 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:37.669857 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" path="/var/lib/kubelet/pods/000fa011-c531-45ba-8110-09ec4f8d9a78/volumes" Apr 23 13:50:40.113899 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:40.113867 2582 generic.go:358] "Generic (PLEG): container finished" podID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerID="8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6" exitCode=0 Apr 23 13:50:40.114231 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:40.113933 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerDied","Data":"8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6"} Apr 23 
13:50:41.119099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:41.119061 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerStarted","Data":"a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c"} Apr 23 13:50:41.119099 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:41.119104 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerStarted","Data":"6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f"} Apr 23 13:50:41.119537 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:41.119382 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:41.119537 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:41.119419 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:41.120560 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:41.120534 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 13:50:41.145488 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:41.145439 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" podStartSLOduration=7.145427278 podStartE2EDuration="7.145427278s" podCreationTimestamp="2026-04-23 13:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:50:41.145004751 +0000 UTC m=+1144.132245631" watchObservedRunningTime="2026-04-23 13:50:41.145427278 +0000 UTC m=+1144.132668215" Apr 23 13:50:42.125259 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:42.125210 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 13:50:47.129715 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:47.129681 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:50:47.130292 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:47.130257 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 13:50:57.131176 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:50:57.131088 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:51:05.025216 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.025175 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz"] Apr 23 13:51:05.025716 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.025483 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" containerID="cri-o://6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f" gracePeriod=30 Apr 23 13:51:05.025716 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.025554 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kube-rbac-proxy" containerID="cri-o://a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c" gracePeriod=30 Apr 23 13:51:05.117954 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.117899 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx"] Apr 23 13:51:05.118266 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118249 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kserve-container" Apr 23 13:51:05.118346 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118270 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kserve-container" Apr 23 13:51:05.118346 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118283 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="storage-initializer" Apr 23 13:51:05.118346 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118291 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="storage-initializer" Apr 23 13:51:05.118346 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118317 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kube-rbac-proxy" Apr 23 13:51:05.118346 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118326 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kube-rbac-proxy" Apr 23 13:51:05.118590 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118409 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kserve-container" Apr 23 13:51:05.118590 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.118426 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="000fa011-c531-45ba-8110-09ec4f8d9a78" containerName="kube-rbac-proxy" Apr 23 13:51:05.121596 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.121575 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.123985 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.123959 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 13:51:05.123985 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.123976 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 23 13:51:05.130062 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.130037 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx"] Apr 23 13:51:05.187845 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.187798 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f10928-3dff-43ee-993f-7163a75261a3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.188082 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.187854 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f10928-3dff-43ee-993f-7163a75261a3-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.188082 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.187890 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtbx5\" (UniqueName: \"kubernetes.io/projected/56f10928-3dff-43ee-993f-7163a75261a3-kube-api-access-mtbx5\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.188082 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.187981 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f10928-3dff-43ee-993f-7163a75261a3-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.192430 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.192397 2582 generic.go:358] "Generic (PLEG): container finished" podID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerID="a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c" exitCode=2 Apr 23 13:51:05.192559 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.192474 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerDied","Data":"a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c"} Apr 23 13:51:05.288933 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.288820 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f10928-3dff-43ee-993f-7163a75261a3-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.288933 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.288893 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f10928-3dff-43ee-993f-7163a75261a3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.289148 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.288940 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f10928-3dff-43ee-993f-7163a75261a3-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.289148 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.288964 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtbx5\" (UniqueName: \"kubernetes.io/projected/56f10928-3dff-43ee-993f-7163a75261a3-kube-api-access-mtbx5\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.289451 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.289423 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f10928-3dff-43ee-993f-7163a75261a3-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.289767 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.289714 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f10928-3dff-43ee-993f-7163a75261a3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.291598 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.291574 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f10928-3dff-43ee-993f-7163a75261a3-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.296938 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.296897 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtbx5\" (UniqueName: \"kubernetes.io/projected/56f10928-3dff-43ee-993f-7163a75261a3-kube-api-access-mtbx5\") pod 
\"isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.432585 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.432544 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:05.573500 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:05.573473 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx"] Apr 23 13:51:05.599029 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:51:05.598989 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f10928_3dff_43ee_993f_7163a75261a3.slice/crio-e66aea214dd570b443883b06e89f3494877bb223a03d94121f5b8773ec69c5bc WatchSource:0}: Error finding container e66aea214dd570b443883b06e89f3494877bb223a03d94121f5b8773ec69c5bc: Status 404 returned error can't find the container with id e66aea214dd570b443883b06e89f3494877bb223a03d94121f5b8773ec69c5bc Apr 23 13:51:06.165761 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.165737 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:51:06.198255 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.198217 2582 generic.go:358] "Generic (PLEG): container finished" podID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerID="6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f" exitCode=0 Apr 23 13:51:06.198433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.198294 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" Apr 23 13:51:06.198433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.198300 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerDied","Data":"6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f"} Apr 23 13:51:06.198433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.198341 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz" event={"ID":"3f6f9dab-3b1d-4e2e-b785-5d8981853772","Type":"ContainerDied","Data":"a81c25c0d06eb42866aee26284baa3a7fc7658a1ed50dd2072a4d5fec55c46e1"} Apr 23 13:51:06.198433 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.198365 2582 scope.go:117] "RemoveContainer" containerID="a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c" Apr 23 13:51:06.199722 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.199701 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerStarted","Data":"23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b"} Apr 23 13:51:06.199828 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.199732 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerStarted","Data":"e66aea214dd570b443883b06e89f3494877bb223a03d94121f5b8773ec69c5bc"} Apr 23 13:51:06.206953 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.206932 2582 scope.go:117] "RemoveContainer" containerID="6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f" Apr 23 13:51:06.214213 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.214194 2582 scope.go:117] "RemoveContainer" containerID="8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6" Apr 23 13:51:06.221464 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.221445 2582 scope.go:117] "RemoveContainer" containerID="a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c" Apr 23 13:51:06.221725 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:51:06.221702 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c\": container with ID starting with a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c not found: ID does not exist" containerID="a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c" Apr 23 13:51:06.221794 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.221739 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c"} err="failed to get container status \"a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c\": rpc error: code = NotFound desc = could not find container \"a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c\": container with ID starting with a614e85f12321089f9e149dde78473cc0321c65e895dcf58721900557e840c8c not found: ID does not exist" Apr 23 13:51:06.221794 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.221766 2582 scope.go:117] "RemoveContainer" 
containerID="6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f" Apr 23 13:51:06.222044 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:51:06.222028 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f\": container with ID starting with 6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f not found: ID does not exist" containerID="6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f" Apr 23 13:51:06.222092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.222050 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f"} err="failed to get container status \"6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f\": rpc error: code = NotFound desc = could not find container \"6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f\": container with ID starting with 6a1f1139ae80dd7e63145484be5f057b36eafb522cd45e5dfdebc9f330fa3d0f not found: ID does not exist" Apr 23 13:51:06.222092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.222072 2582 scope.go:117] "RemoveContainer" containerID="8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6" Apr 23 13:51:06.222320 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:51:06.222300 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6\": container with ID starting with 8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6 not found: ID does not exist" containerID="8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6" Apr 23 13:51:06.222373 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.222326 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6"} err="failed to get container status \"8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6\": rpc error: code = NotFound desc = could not find container \"8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6\": container with ID starting with 8c642ba0e35d131385d02600a2ede4a85c36819b4a8de8b66e54fa82f0ee22d6 not found: ID does not exist" Apr 23 13:51:06.297749 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.297652 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls\") pod \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " Apr 23 13:51:06.297749 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.297729 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttvwg\" (UniqueName: \"kubernetes.io/projected/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kube-api-access-ttvwg\") pod \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " Apr 23 13:51:06.297749 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.297754 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kserve-provision-location\") pod \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\" 
(UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " Apr 23 13:51:06.298082 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.297776 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3f6f9dab-3b1d-4e2e-b785-5d8981853772-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\" (UID: \"3f6f9dab-3b1d-4e2e-b785-5d8981853772\") " Apr 23 13:51:06.298209 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.298178 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3f6f9dab-3b1d-4e2e-b785-5d8981853772" (UID: "3f6f9dab-3b1d-4e2e-b785-5d8981853772"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:06.298267 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.298210 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6f9dab-3b1d-4e2e-b785-5d8981853772-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "3f6f9dab-3b1d-4e2e-b785-5d8981853772" (UID: "3f6f9dab-3b1d-4e2e-b785-5d8981853772"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:06.299992 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.299968 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3f6f9dab-3b1d-4e2e-b785-5d8981853772" (UID: "3f6f9dab-3b1d-4e2e-b785-5d8981853772"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:06.300364 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.300346 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kube-api-access-ttvwg" (OuterVolumeSpecName: "kube-api-access-ttvwg") pod "3f6f9dab-3b1d-4e2e-b785-5d8981853772" (UID: "3f6f9dab-3b1d-4e2e-b785-5d8981853772"). InnerVolumeSpecName "kube-api-access-ttvwg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:06.398907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.398858 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f6f9dab-3b1d-4e2e-b785-5d8981853772-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:06.398907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.398904 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttvwg\" (UniqueName: \"kubernetes.io/projected/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kube-api-access-ttvwg\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:06.398907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.398944 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f6f9dab-3b1d-4e2e-b785-5d8981853772-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:06.399195 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.398960 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3f6f9dab-3b1d-4e2e-b785-5d8981853772-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:06.519991 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.519939 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz"] Apr 23 13:51:06.525026 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:06.524993 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-h2ckz"] Apr 23 13:51:07.670137 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:07.670103 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" path="/var/lib/kubelet/pods/3f6f9dab-3b1d-4e2e-b785-5d8981853772/volumes" Apr 23 13:51:10.214797 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:10.214757 2582 generic.go:358] "Generic (PLEG): container finished" podID="56f10928-3dff-43ee-993f-7163a75261a3" containerID="23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b" exitCode=0 Apr 23 13:51:10.215234 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:10.214831 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerDied","Data":"23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b"} Apr 23 13:51:11.220071 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:11.220033 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerStarted","Data":"250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46"} Apr 23 13:51:11.220071 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:11.220075 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerStarted","Data":"1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e"} Apr 23 13:51:11.220553 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:11.220398 2582 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:11.220553 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:11.220430 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:11.242506 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:11.242448 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" podStartSLOduration=6.242431434 podStartE2EDuration="6.242431434s" podCreationTimestamp="2026-04-23 13:51:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:51:11.241175662 +0000 UTC m=+1174.228416565" watchObservedRunningTime="2026-04-23 13:51:11.242431434 +0000 UTC m=+1174.229672312" Apr 23 13:51:17.229333 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:17.229301 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:37.592480 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:37.592449 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:51:37.593513 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:37.593493 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:51:47.233155 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:47.233126 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:55.188848 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:55.188815 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx"] Apr 23 13:51:55.189385 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:55.189174 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kserve-container" containerID="cri-o://1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e" gracePeriod=30 Apr 23 13:51:55.189385 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:55.189215 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kube-rbac-proxy" containerID="cri-o://250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46" gracePeriod=30 Apr 23 13:51:55.346276 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:55.346238 2582 generic.go:358] "Generic (PLEG): container finished" podID="56f10928-3dff-43ee-993f-7163a75261a3" containerID="250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46" exitCode=2 Apr 23 13:51:55.346444 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:55.346284 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" 
event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerDied","Data":"250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46"} Apr 23 13:51:56.541588 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.541560 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:56.596217 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.596134 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtbx5\" (UniqueName: \"kubernetes.io/projected/56f10928-3dff-43ee-993f-7163a75261a3-kube-api-access-mtbx5\") pod \"56f10928-3dff-43ee-993f-7163a75261a3\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " Apr 23 13:51:56.596217 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.596182 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f10928-3dff-43ee-993f-7163a75261a3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"56f10928-3dff-43ee-993f-7163a75261a3\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " Apr 23 13:51:56.596217 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.596211 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f10928-3dff-43ee-993f-7163a75261a3-proxy-tls\") pod \"56f10928-3dff-43ee-993f-7163a75261a3\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " Apr 23 13:51:56.596550 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.596269 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f10928-3dff-43ee-993f-7163a75261a3-kserve-provision-location\") pod \"56f10928-3dff-43ee-993f-7163a75261a3\" (UID: \"56f10928-3dff-43ee-993f-7163a75261a3\") " Apr 23 13:51:56.596656 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.596627 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f10928-3dff-43ee-993f-7163a75261a3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "56f10928-3dff-43ee-993f-7163a75261a3" (UID: "56f10928-3dff-43ee-993f-7163a75261a3"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:56.596718 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.596700 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f10928-3dff-43ee-993f-7163a75261a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "56f10928-3dff-43ee-993f-7163a75261a3" (UID: "56f10928-3dff-43ee-993f-7163a75261a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:56.598479 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.598456 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f10928-3dff-43ee-993f-7163a75261a3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "56f10928-3dff-43ee-993f-7163a75261a3" (UID: "56f10928-3dff-43ee-993f-7163a75261a3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:56.598569 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.598546 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f10928-3dff-43ee-993f-7163a75261a3-kube-api-access-mtbx5" (OuterVolumeSpecName: "kube-api-access-mtbx5") pod "56f10928-3dff-43ee-993f-7163a75261a3" (UID: "56f10928-3dff-43ee-993f-7163a75261a3"). InnerVolumeSpecName "kube-api-access-mtbx5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:56.696995 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.696962 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f10928-3dff-43ee-993f-7163a75261a3-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:56.696995 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.696993 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtbx5\" (UniqueName: \"kubernetes.io/projected/56f10928-3dff-43ee-993f-7163a75261a3-kube-api-access-mtbx5\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:56.697189 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.697003 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56f10928-3dff-43ee-993f-7163a75261a3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:56.697189 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:56.697014 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56f10928-3dff-43ee-993f-7163a75261a3-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:51:57.353945 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.353895 2582 generic.go:358] "Generic (PLEG): container finished" podID="56f10928-3dff-43ee-993f-7163a75261a3" containerID="1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e" exitCode=0 Apr 23 13:51:57.354158 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.353983 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerDied","Data":"1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e"} Apr 23 13:51:57.354158 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.354004 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" Apr 23 13:51:57.354158 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.354021 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx" event={"ID":"56f10928-3dff-43ee-993f-7163a75261a3","Type":"ContainerDied","Data":"e66aea214dd570b443883b06e89f3494877bb223a03d94121f5b8773ec69c5bc"} Apr 23 13:51:57.354158 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.354037 2582 scope.go:117] "RemoveContainer" containerID="250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46" Apr 23 13:51:57.364539 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.364519 2582 scope.go:117] "RemoveContainer" containerID="1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e" Apr 23 13:51:57.371899 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.371883 2582 scope.go:117] "RemoveContainer" containerID="23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b" Apr 23 13:51:57.378510 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.378488 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx"] Apr 23 13:51:57.379142 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.379122 2582 scope.go:117] "RemoveContainer" containerID="250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46" Apr 23 13:51:57.379432 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:51:57.379414 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46\": container with ID starting with 250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46 not found: ID does not exist" containerID="250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46" Apr 23 13:51:57.379504 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.379439 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46"} err="failed to get container status \"250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46\": rpc error: code = NotFound desc = could not find container \"250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46\": container with ID starting with 250a0ba09cc747ffedfcfcc9e3407b59cce070c56f3648d3994a443a4bd94e46 not found: ID does not exist" Apr 23 13:51:57.379504 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.379458 2582 scope.go:117] "RemoveContainer" containerID="1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e" Apr 23 13:51:57.379741 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:51:57.379723 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e\": container with ID starting with 1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e not found: ID does not exist" containerID="1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e" Apr 23 13:51:57.379790 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.379745 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e"} err="failed to get container status 
\"1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e\": rpc error: code = NotFound desc = could not find container \"1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e\": container with ID starting with 1aab453b76e3009ff3715bc6d95385df7fc17356d2a6cef1b65c4fe4c5f6cb2e not found: ID does not exist" Apr 23 13:51:57.379790 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.379767 2582 scope.go:117] "RemoveContainer" containerID="23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b" Apr 23 13:51:57.380079 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:51:57.380035 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b\": container with ID starting with 23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b not found: ID does not exist" containerID="23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b" Apr 23 13:51:57.380079 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.380063 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b"} err="failed to get container status \"23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b\": rpc error: code = NotFound desc = could not find container \"23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b\": container with ID starting with 23d78c249e25610b26ce932cd215254904fe8f3ae5662a474815b4bc987bd48b not found: ID does not exist" Apr 23 13:51:57.381493 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.381472 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-4c4mx"] Apr 23 13:51:57.669399 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:51:57.669363 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f10928-3dff-43ee-993f-7163a75261a3" path="/var/lib/kubelet/pods/56f10928-3dff-43ee-993f-7163a75261a3/volumes" Apr 23 13:53:15.436452 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436419 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx"] Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436681 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kube-rbac-proxy" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436692 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kube-rbac-proxy" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436702 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436708 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436716 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="storage-initializer" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436721 2582 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="storage-initializer" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436728 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="storage-initializer" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436733 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="storage-initializer" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436745 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kserve-container" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436750 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kserve-container" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436760 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kube-rbac-proxy" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436766 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kube-rbac-proxy" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436807 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kserve-container" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436814 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kserve-container" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436819 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f6f9dab-3b1d-4e2e-b785-5d8981853772" containerName="kube-rbac-proxy" Apr 23 13:53:15.436907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.436826 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f10928-3dff-43ee-993f-7163a75261a3" containerName="kube-rbac-proxy" Apr 23 13:53:15.439750 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.439734 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.442392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.442367 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:53:15.442392 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.442388 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:53:15.442568 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.442455 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 23 13:53:15.443265 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.443247 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 23 13:53:15.443350 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.443283 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:53:15.450383 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.450362 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx"] Apr 23 13:53:15.477311 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.477284 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1183adb7-cb69-4fcd-a996-8ecb2089715b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.477415 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.477316 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74bp\" (UniqueName: \"kubernetes.io/projected/1183adb7-cb69-4fcd-a996-8ecb2089715b-kube-api-access-l74bp\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.477415 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.477398 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183adb7-cb69-4fcd-a996-8ecb2089715b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.477502 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.477420 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1183adb7-cb69-4fcd-a996-8ecb2089715b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.578633 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.578596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183adb7-cb69-4fcd-a996-8ecb2089715b-proxy-tls\") pod 
\"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.578804 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.578647 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1183adb7-cb69-4fcd-a996-8ecb2089715b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.578804 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.578704 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1183adb7-cb69-4fcd-a996-8ecb2089715b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.578804 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.578741 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l74bp\" (UniqueName: \"kubernetes.io/projected/1183adb7-cb69-4fcd-a996-8ecb2089715b-kube-api-access-l74bp\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.579341 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.579318 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1183adb7-cb69-4fcd-a996-8ecb2089715b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.579500 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.579482 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1183adb7-cb69-4fcd-a996-8ecb2089715b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.581495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.581463 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183adb7-cb69-4fcd-a996-8ecb2089715b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.586522 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.586497 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74bp\" (UniqueName: \"kubernetes.io/projected/1183adb7-cb69-4fcd-a996-8ecb2089715b-kube-api-access-l74bp\") pod \"isvc-paddle-predictor-6b8b7cfb4b-qz6kx\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.752175 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.752088 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:15.874584 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:15.874543 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx"] Apr 23 13:53:15.877291 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:53:15.877263 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1183adb7_cb69_4fcd_a996_8ecb2089715b.slice/crio-470fc8b18196a2caaae34920876d1948761476f007db9a0b7dc8d17f180d57ea WatchSource:0}: Error finding container 470fc8b18196a2caaae34920876d1948761476f007db9a0b7dc8d17f180d57ea: Status 404 returned error can't find the container with id 470fc8b18196a2caaae34920876d1948761476f007db9a0b7dc8d17f180d57ea Apr 23 13:53:16.573975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:16.573937 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerStarted","Data":"792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3"} Apr 23 13:53:16.573975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:16.573977 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerStarted","Data":"470fc8b18196a2caaae34920876d1948761476f007db9a0b7dc8d17f180d57ea"} Apr 23 13:53:20.586303 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:20.586264 2582 generic.go:358] "Generic (PLEG): container finished" podID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerID="792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3" exitCode=0 Apr 23 13:53:20.586646 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:20.586340 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerDied","Data":"792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3"} Apr 23 13:53:31.629768 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:31.629681 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerStarted","Data":"9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea"} Apr 23 13:53:31.629768 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:31.629724 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerStarted","Data":"2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873"} Apr 23 13:53:31.630213 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:31.629954 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:31.650038 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:31.649983 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podStartSLOduration=5.938333979 podStartE2EDuration="16.649968841s" podCreationTimestamp="2026-04-23 13:53:15 +0000 UTC" firstStartedPulling="2026-04-23 13:53:20.587486461 +0000 UTC m=+1303.574727319" lastFinishedPulling="2026-04-23 
13:53:31.299121323 +0000 UTC m=+1314.286362181" observedRunningTime="2026-04-23 13:53:31.648237647 +0000 UTC m=+1314.635478554" watchObservedRunningTime="2026-04-23 13:53:31.649968841 +0000 UTC m=+1314.637209716" Apr 23 13:53:32.633471 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:32.633441 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:32.634803 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:32.634774 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:53:33.636487 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:33.636445 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:53:38.641019 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:38.640988 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:53:38.641710 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:38.641676 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:53:48.641795 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:48.641751 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:53:58.642404 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:53:58.642314 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:54:08.642227 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:08.642187 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:54:18.642110 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:18.642076 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:54:26.971167 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:26.971127 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx"] Apr 23 13:54:26.971705 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:26.971546 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" 
containerName="kube-rbac-proxy" containerID="cri-o://9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea" gracePeriod=30 Apr 23 13:54:26.971705 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:26.971523 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" containerID="cri-o://2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873" gracePeriod=30 Apr 23 13:54:27.180799 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.180770 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64"] Apr 23 13:54:27.183986 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.183971 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.187297 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.187280 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 23 13:54:27.187493 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.187470 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 23 13:54:27.208798 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.208772 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64"] Apr 23 13:54:27.237829 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.237759 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnsns\" (UniqueName: \"kubernetes.io/projected/553e244d-2d11-4a3b-b422-09efbe6c3610-kube-api-access-xnsns\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.237829 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.237806 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.238016 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.237937 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/553e244d-2d11-4a3b-b422-09efbe6c3610-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.238079 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.238032 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/553e244d-2d11-4a3b-b422-09efbe6c3610-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.338358 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.338328 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/553e244d-2d11-4a3b-b422-09efbe6c3610-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.338495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.338406 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/553e244d-2d11-4a3b-b422-09efbe6c3610-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.338495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.338439 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnsns\" (UniqueName: \"kubernetes.io/projected/553e244d-2d11-4a3b-b422-09efbe6c3610-kube-api-access-xnsns\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.338495 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.338468 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.338658 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:54:27.338594 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-runtime-predictor-serving-cert: secret "isvc-paddle-runtime-predictor-serving-cert" not found Apr 23 13:54:27.338709 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:54:27.338666 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls podName:553e244d-2d11-4a3b-b422-09efbe6c3610 nodeName:}" failed. No retries permitted until 2026-04-23 13:54:27.838643137 +0000 UTC m=+1370.825884008 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls") pod "isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" (UID: "553e244d-2d11-4a3b-b422-09efbe6c3610") : secret "isvc-paddle-runtime-predictor-serving-cert" not found Apr 23 13:54:27.338771 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.338715 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/553e244d-2d11-4a3b-b422-09efbe6c3610-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.339141 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.339119 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/553e244d-2d11-4a3b-b422-09efbe6c3610-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.351622 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.351590 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnsns\" (UniqueName: \"kubernetes.io/projected/553e244d-2d11-4a3b-b422-09efbe6c3610-kube-api-access-xnsns\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.780745 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.780713 2582 generic.go:358] "Generic (PLEG): container finished" podID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerID="9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea" exitCode=2 Apr 23 13:54:27.780906 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.780768 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerDied","Data":"9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea"} Apr 23 13:54:27.841206 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.841179 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:27.843660 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:27.843641 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:28.094332 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:28.094252 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:28.220291 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:28.220264 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64"] Apr 23 13:54:28.222190 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:54:28.222164 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553e244d_2d11_4a3b_b422_09efbe6c3610.slice/crio-ea40671423b84f7000265cb7e8df96976f081fedbe00c55658bd192587901406 WatchSource:0}: Error finding container ea40671423b84f7000265cb7e8df96976f081fedbe00c55658bd192587901406: Status 404 returned error can't find the container with id ea40671423b84f7000265cb7e8df96976f081fedbe00c55658bd192587901406 Apr 23 13:54:28.637636 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:28.637589 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 23 13:54:28.642058 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:28.642029 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 13:54:28.785153 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:28.785119 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerStarted","Data":"e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf"} Apr 23 13:54:28.785153 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:28.785155 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerStarted","Data":"ea40671423b84f7000265cb7e8df96976f081fedbe00c55658bd192587901406"} Apr 23 13:54:29.606743 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.606719 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:54:29.655305 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.655281 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74bp\" (UniqueName: \"kubernetes.io/projected/1183adb7-cb69-4fcd-a996-8ecb2089715b-kube-api-access-l74bp\") pod \"1183adb7-cb69-4fcd-a996-8ecb2089715b\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " Apr 23 13:54:29.655429 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.655313 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1183adb7-cb69-4fcd-a996-8ecb2089715b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"1183adb7-cb69-4fcd-a996-8ecb2089715b\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " Apr 23 13:54:29.655429 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.655348 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183adb7-cb69-4fcd-a996-8ecb2089715b-proxy-tls\") pod \"1183adb7-cb69-4fcd-a996-8ecb2089715b\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " Apr 23 13:54:29.655499 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.655475 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1183adb7-cb69-4fcd-a996-8ecb2089715b-kserve-provision-location\") pod \"1183adb7-cb69-4fcd-a996-8ecb2089715b\" (UID: \"1183adb7-cb69-4fcd-a996-8ecb2089715b\") " Apr 23 13:54:29.655778 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.655753 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1183adb7-cb69-4fcd-a996-8ecb2089715b-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "1183adb7-cb69-4fcd-a996-8ecb2089715b" (UID: "1183adb7-cb69-4fcd-a996-8ecb2089715b"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:54:29.657558 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.657535 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1183adb7-cb69-4fcd-a996-8ecb2089715b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1183adb7-cb69-4fcd-a996-8ecb2089715b" (UID: "1183adb7-cb69-4fcd-a996-8ecb2089715b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:29.657632 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.657535 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1183adb7-cb69-4fcd-a996-8ecb2089715b-kube-api-access-l74bp" (OuterVolumeSpecName: "kube-api-access-l74bp") pod "1183adb7-cb69-4fcd-a996-8ecb2089715b" (UID: "1183adb7-cb69-4fcd-a996-8ecb2089715b"). InnerVolumeSpecName "kube-api-access-l74bp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:29.665007 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.664985 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1183adb7-cb69-4fcd-a996-8ecb2089715b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1183adb7-cb69-4fcd-a996-8ecb2089715b" (UID: "1183adb7-cb69-4fcd-a996-8ecb2089715b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:29.756565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.756509 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1183adb7-cb69-4fcd-a996-8ecb2089715b-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:54:29.756565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.756531 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l74bp\" (UniqueName: \"kubernetes.io/projected/1183adb7-cb69-4fcd-a996-8ecb2089715b-kube-api-access-l74bp\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:54:29.756565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.756543 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1183adb7-cb69-4fcd-a996-8ecb2089715b-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:54:29.756565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.756553 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183adb7-cb69-4fcd-a996-8ecb2089715b-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:54:29.789106 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.789073 2582 generic.go:358] "Generic (PLEG): container finished" podID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerID="2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873" exitCode=0 Apr 23 13:54:29.789187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.789112 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerDied","Data":"2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873"} Apr 23 13:54:29.789187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.789156 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" event={"ID":"1183adb7-cb69-4fcd-a996-8ecb2089715b","Type":"ContainerDied","Data":"470fc8b18196a2caaae34920876d1948761476f007db9a0b7dc8d17f180d57ea"} Apr 23 13:54:29.789187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.789160 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx" Apr 23 13:54:29.789290 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.789174 2582 scope.go:117] "RemoveContainer" containerID="9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea" Apr 23 13:54:29.797057 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.797033 2582 scope.go:117] "RemoveContainer" containerID="2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873" Apr 23 13:54:29.803784 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.803768 2582 scope.go:117] "RemoveContainer" containerID="792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3" Apr 23 13:54:29.805993 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.805971 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx"] Apr 23 13:54:29.809761 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.809734 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-qz6kx"] Apr 23 13:54:29.810880 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.810864 2582 scope.go:117] "RemoveContainer" containerID="9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea" Apr 23 13:54:29.811144 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:54:29.811127 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea\": container with ID starting with 9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea not found: ID does not exist" containerID="9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea" Apr 23 13:54:29.811188 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.811152 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea"} err="failed to get container status \"9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea\": rpc error: code = NotFound desc = could not find container \"9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea\": container with ID starting with 9bd6cdead09c5212a71d3d82cb6b5e2c598013efe861d7f49eb9bc5c64ffd4ea not found: ID does not exist" Apr 23 13:54:29.811188 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.811168 2582 scope.go:117] "RemoveContainer" containerID="2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873" Apr 23 13:54:29.811365 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:54:29.811351 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873\": container with ID starting with 2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873 not found: ID does not exist" containerID="2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873" Apr 23 13:54:29.811404 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.811368 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873"} err="failed to get container status \"2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873\": rpc error: code = NotFound desc = could not find container \"2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873\": container with ID 
starting with 2f41e5c3b36be84cb9a0d658c89f4870176fcad9f93fa4359e8711aac2c96873 not found: ID does not exist" Apr 23 13:54:29.811404 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.811380 2582 scope.go:117] "RemoveContainer" containerID="792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3" Apr 23 13:54:29.811553 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:54:29.811538 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3\": container with ID starting with 792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3 not found: ID does not exist" containerID="792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3" Apr 23 13:54:29.811590 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:29.811557 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3"} err="failed to get container status \"792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3\": rpc error: code = NotFound desc = could not find container \"792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3\": container with ID starting with 792008daeacd4e2b6f1dfbe74fd0c52229700d87e48425185e08771ffa614fa3 not found: ID does not exist" Apr 23 13:54:31.669563 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:31.669531 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" path="/var/lib/kubelet/pods/1183adb7-cb69-4fcd-a996-8ecb2089715b/volumes" Apr 23 13:54:32.800092 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:32.800011 2582 generic.go:358] "Generic (PLEG): container finished" podID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerID="e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf" exitCode=0 Apr 23 13:54:32.800425 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:32.800087 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerDied","Data":"e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf"} Apr 23 13:54:33.804734 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:33.804703 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerStarted","Data":"3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba"} Apr 23 13:54:33.805135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:33.804741 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerStarted","Data":"437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64"} Apr 23 13:54:33.805135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:33.805055 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:33.805135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:33.805091 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:33.806408 ip-10-0-139-40 kubenswrapper[2582]: I0423 
13:54:33.806381 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:54:33.827335 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:33.827296 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podStartSLOduration=6.827284859 podStartE2EDuration="6.827284859s" podCreationTimestamp="2026-04-23 13:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:33.826278891 +0000 UTC m=+1376.813519788" watchObservedRunningTime="2026-04-23 13:54:33.827284859 +0000 UTC m=+1376.814525737" Apr 23 13:54:34.807936 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:34.807882 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:54:39.812625 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:39.812593 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:54:39.813210 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:39.813181 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:54:49.813813 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:49.813773 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:54:59.813682 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:54:59.813640 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:55:09.813598 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:09.813555 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:55:19.813711 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:19.813676 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:55:28.371887 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.371853 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64"] Apr 23 13:55:28.372409 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.372181 2582 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" containerID="cri-o://437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64" gracePeriod=30 Apr 23 13:55:28.372409 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.372202 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kube-rbac-proxy" containerID="cri-o://3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba" gracePeriod=30 Apr 23 13:55:28.469769 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.469738 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6"] Apr 23 13:55:28.470045 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470032 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="storage-initializer" Apr 23 13:55:28.470095 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470047 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="storage-initializer" Apr 23 13:55:28.470095 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470060 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" Apr 23 13:55:28.470095 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470065 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" Apr 23 13:55:28.470095 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470072 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kube-rbac-proxy" Apr 23 13:55:28.470095 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470080 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kube-rbac-proxy" Apr 23 13:55:28.470239 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470135 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kserve-container" Apr 23 13:55:28.470239 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.470142 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1183adb7-cb69-4fcd-a996-8ecb2089715b" containerName="kube-rbac-proxy" Apr 23 13:55:28.473244 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.473228 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.476456 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.476431 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 23 13:55:28.476456 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.476433 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 13:55:28.491255 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.491092 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6"] Apr 23 13:55:28.599520 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.599486 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652nk\" (UniqueName: \"kubernetes.io/projected/0c952a49-dbb7-4899-8e4e-80c16deba78b-kube-api-access-652nk\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.599695 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.599546 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c952a49-dbb7-4899-8e4e-80c16deba78b-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.599695 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.599629 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c952a49-dbb7-4899-8e4e-80c16deba78b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.599695 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.599690 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c952a49-dbb7-4899-8e4e-80c16deba78b-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.700232 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.700196 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-652nk\" (UniqueName: \"kubernetes.io/projected/0c952a49-dbb7-4899-8e4e-80c16deba78b-kube-api-access-652nk\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.700423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.700255 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/0c952a49-dbb7-4899-8e4e-80c16deba78b-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.700423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.700282 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c952a49-dbb7-4899-8e4e-80c16deba78b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.700423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.700313 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c952a49-dbb7-4899-8e4e-80c16deba78b-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.700740 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.700719 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c952a49-dbb7-4899-8e4e-80c16deba78b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.701299 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.701274 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c952a49-dbb7-4899-8e4e-80c16deba78b-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.702977 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.702960 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c952a49-dbb7-4899-8e4e-80c16deba78b-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.709064 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.709045 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-652nk\" (UniqueName: \"kubernetes.io/projected/0c952a49-dbb7-4899-8e4e-80c16deba78b-kube-api-access-652nk\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.784044 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.784009 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:28.912519 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.912485 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6"] Apr 23 13:55:28.916531 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:55:28.916503 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c952a49_dbb7_4899_8e4e_80c16deba78b.slice/crio-7c625432c6d8512fc1cbdfdd1a1eb023c02280ae2e6bdc89ed58bfad62e72f49 WatchSource:0}: Error finding container 7c625432c6d8512fc1cbdfdd1a1eb023c02280ae2e6bdc89ed58bfad62e72f49: Status 404 returned error can't find the container with id 7c625432c6d8512fc1cbdfdd1a1eb023c02280ae2e6bdc89ed58bfad62e72f49 Apr 23 13:55:28.918187 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.918171 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:55:28.962977 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.962947 2582 generic.go:358] "Generic (PLEG): container finished" podID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerID="3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba" exitCode=2 Apr 23 13:55:28.963108 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.963033 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerDied","Data":"3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba"} Apr 23 13:55:28.963858 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:28.963830 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerStarted","Data":"7c625432c6d8512fc1cbdfdd1a1eb023c02280ae2e6bdc89ed58bfad62e72f49"} Apr 23 13:55:29.809192 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:29.809149 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 23 13:55:29.813661 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:29.813637 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 13:55:29.967670 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:29.967635 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerStarted","Data":"4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4"} Apr 23 13:55:31.121607 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.121586 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:55:31.221565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.221479 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/553e244d-2d11-4a3b-b422-09efbe6c3610-kserve-provision-location\") pod \"553e244d-2d11-4a3b-b422-09efbe6c3610\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " Apr 23 13:55:31.221565 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.221560 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls\") pod \"553e244d-2d11-4a3b-b422-09efbe6c3610\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " Apr 23 13:55:31.221756 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.221586 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnsns\" (UniqueName: \"kubernetes.io/projected/553e244d-2d11-4a3b-b422-09efbe6c3610-kube-api-access-xnsns\") pod \"553e244d-2d11-4a3b-b422-09efbe6c3610\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " Apr 23 13:55:31.221756 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.221606 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/553e244d-2d11-4a3b-b422-09efbe6c3610-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"553e244d-2d11-4a3b-b422-09efbe6c3610\" (UID: \"553e244d-2d11-4a3b-b422-09efbe6c3610\") " Apr 23 13:55:31.222020 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.221994 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553e244d-2d11-4a3b-b422-09efbe6c3610-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "553e244d-2d11-4a3b-b422-09efbe6c3610" (UID: "553e244d-2d11-4a3b-b422-09efbe6c3610"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:55:31.223856 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.223827 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "553e244d-2d11-4a3b-b422-09efbe6c3610" (UID: "553e244d-2d11-4a3b-b422-09efbe6c3610"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:55:31.223982 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.223945 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553e244d-2d11-4a3b-b422-09efbe6c3610-kube-api-access-xnsns" (OuterVolumeSpecName: "kube-api-access-xnsns") pod "553e244d-2d11-4a3b-b422-09efbe6c3610" (UID: "553e244d-2d11-4a3b-b422-09efbe6c3610"). InnerVolumeSpecName "kube-api-access-xnsns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:55:31.230171 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.230145 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/553e244d-2d11-4a3b-b422-09efbe6c3610-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "553e244d-2d11-4a3b-b422-09efbe6c3610" (UID: "553e244d-2d11-4a3b-b422-09efbe6c3610"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:55:31.323147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.323097 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/553e244d-2d11-4a3b-b422-09efbe6c3610-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:55:31.323147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.323142 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/553e244d-2d11-4a3b-b422-09efbe6c3610-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:55:31.323147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.323155 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnsns\" (UniqueName: \"kubernetes.io/projected/553e244d-2d11-4a3b-b422-09efbe6c3610-kube-api-access-xnsns\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:55:31.323147 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.323165 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/553e244d-2d11-4a3b-b422-09efbe6c3610-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:55:31.975794 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.975712 2582 generic.go:358] "Generic (PLEG): container finished" podID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerID="437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64" exitCode=0 Apr 23 13:55:31.975955 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.975790 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" Apr 23 13:55:31.975955 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.975801 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerDied","Data":"437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64"} Apr 23 13:55:31.975955 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.975841 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64" event={"ID":"553e244d-2d11-4a3b-b422-09efbe6c3610","Type":"ContainerDied","Data":"ea40671423b84f7000265cb7e8df96976f081fedbe00c55658bd192587901406"} Apr 23 13:55:31.975955 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.975858 2582 scope.go:117] "RemoveContainer" containerID="3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba" Apr 23 13:55:31.983659 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.983642 2582 scope.go:117] "RemoveContainer" containerID="437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64" Apr 23 13:55:31.990801 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.990636 2582 scope.go:117] "RemoveContainer" containerID="e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf" Apr 23 13:55:31.994750 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.994727 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64"] Apr 23 13:55:31.998135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.998110 2582 scope.go:117] "RemoveContainer" containerID="3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba" Apr 23 13:55:31.998504 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:55:31.998482 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba\": container with ID starting with 3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba not found: ID does not exist" containerID="3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba" Apr 23 13:55:31.998504 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.998517 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba"} err="failed to get container status \"3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba\": rpc error: code = NotFound desc = could not find container \"3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba\": container with ID starting with 3d87ffecd4f7ff39af5708482a093c031cff8ce7bc1c35b84d1857193dfe65ba not found: ID does not exist" Apr 23 13:55:31.998721 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.998551 2582 scope.go:117] "RemoveContainer" containerID="437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64" Apr 23 13:55:31.998945 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:55:31.998904 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64\": container with ID starting with 437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64 not found: ID does not exist" 
containerID="437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64" Apr 23 13:55:31.999005 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.998957 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64"} err="failed to get container status \"437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64\": rpc error: code = NotFound desc = could not find container \"437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64\": container with ID starting with 437d259730f30b28cc9801f41c6664f39e9cbee64e51c3d55a2f2fad4f30fe64 not found: ID does not exist" Apr 23 13:55:31.999005 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.998990 2582 scope.go:117] "RemoveContainer" containerID="e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf" Apr 23 13:55:31.999332 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:55:31.999308 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf\": container with ID starting with e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf not found: ID does not exist" containerID="e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf" Apr 23 13:55:31.999383 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:31.999342 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf"} err="failed to get container status \"e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf\": rpc error: code = NotFound desc = could not find container \"e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf\": container with ID starting with e0461a04e4523481b57f92f1a3e3a445ccf1f9f351f994b913081204f78f3ecf not found: ID does not exist" Apr 23 13:55:32.000366 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:32.000347 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-r8b64"] Apr 23 13:55:33.669379 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:33.669347 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" path="/var/lib/kubelet/pods/553e244d-2d11-4a3b-b422-09efbe6c3610/volumes" Apr 23 13:55:33.983608 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:33.983518 2582 generic.go:358] "Generic (PLEG): container finished" podID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerID="4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4" exitCode=0 Apr 23 13:55:33.983753 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:33.983600 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerDied","Data":"4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4"} Apr 23 13:55:34.988978 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:34.988945 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerStarted","Data":"8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191"} Apr 23 13:55:34.988978 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:34.988986 2582 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerStarted","Data":"f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c"} Apr 23 13:55:34.989458 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:34.989331 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:34.989522 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:34.989479 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:34.990865 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:34.990842 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:55:35.008679 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:35.008636 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podStartSLOduration=7.008621839 podStartE2EDuration="7.008621839s" podCreationTimestamp="2026-04-23 13:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:55:35.007255259 +0000 UTC m=+1437.994496134" watchObservedRunningTime="2026-04-23 13:55:35.008621839 +0000 UTC m=+1437.995862717" Apr 23 13:55:35.992260 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:35.992223 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:55:36.995192 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:36.995148 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:55:41.999535 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:41.999502 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:55:42.000123 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:42.000096 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:55:52.000390 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:55:52.000347 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:56:02.000781 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:02.000738 2582 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:56:12.000447 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:12.000397 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:56:22.000724 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:22.000692 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:56:30.093462 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:30.093430 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6"] Apr 23 13:56:30.093831 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:30.093736 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" containerID="cri-o://f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c" gracePeriod=30 Apr 23 13:56:30.093831 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:30.093759 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kube-rbac-proxy" containerID="cri-o://8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191" gracePeriod=30 Apr 23 13:56:31.147340 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:31.147302 2582 generic.go:358] "Generic (PLEG): container finished" podID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerID="8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191" exitCode=2 Apr 23 13:56:31.147714 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:31.147376 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerDied","Data":"8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191"} Apr 23 13:56:31.995837 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:31.995794 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 23 13:56:32.000135 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.000103 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 13:56:32.832280 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.832256 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:56:32.969567 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.969485 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-652nk\" (UniqueName: \"kubernetes.io/projected/0c952a49-dbb7-4899-8e4e-80c16deba78b-kube-api-access-652nk\") pod \"0c952a49-dbb7-4899-8e4e-80c16deba78b\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " Apr 23 13:56:32.969567 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.969518 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c952a49-dbb7-4899-8e4e-80c16deba78b-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"0c952a49-dbb7-4899-8e4e-80c16deba78b\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " Apr 23 13:56:32.969567 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.969556 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c952a49-dbb7-4899-8e4e-80c16deba78b-kserve-provision-location\") pod \"0c952a49-dbb7-4899-8e4e-80c16deba78b\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " Apr 23 13:56:32.969828 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.969666 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c952a49-dbb7-4899-8e4e-80c16deba78b-proxy-tls\") pod \"0c952a49-dbb7-4899-8e4e-80c16deba78b\" (UID: \"0c952a49-dbb7-4899-8e4e-80c16deba78b\") " Apr 23 13:56:32.969907 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.969880 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c952a49-dbb7-4899-8e4e-80c16deba78b-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "0c952a49-dbb7-4899-8e4e-80c16deba78b" (UID: "0c952a49-dbb7-4899-8e4e-80c16deba78b"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:56:32.971809 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.971778 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c952a49-dbb7-4899-8e4e-80c16deba78b-kube-api-access-652nk" (OuterVolumeSpecName: "kube-api-access-652nk") pod "0c952a49-dbb7-4899-8e4e-80c16deba78b" (UID: "0c952a49-dbb7-4899-8e4e-80c16deba78b"). InnerVolumeSpecName "kube-api-access-652nk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:56:32.971912 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.971811 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c952a49-dbb7-4899-8e4e-80c16deba78b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0c952a49-dbb7-4899-8e4e-80c16deba78b" (UID: "0c952a49-dbb7-4899-8e4e-80c16deba78b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:56:32.979647 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:32.979623 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c952a49-dbb7-4899-8e4e-80c16deba78b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0c952a49-dbb7-4899-8e4e-80c16deba78b" (UID: "0c952a49-dbb7-4899-8e4e-80c16deba78b"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:33.071176 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.071141 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-652nk\" (UniqueName: \"kubernetes.io/projected/0c952a49-dbb7-4899-8e4e-80c16deba78b-kube-api-access-652nk\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:56:33.071176 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.071173 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c952a49-dbb7-4899-8e4e-80c16deba78b-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:56:33.071366 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.071188 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c952a49-dbb7-4899-8e4e-80c16deba78b-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:56:33.071366 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.071200 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c952a49-dbb7-4899-8e4e-80c16deba78b-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:56:33.154652 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.154618 2582 generic.go:358] "Generic (PLEG): container finished" podID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerID="f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c" exitCode=0 Apr 23 13:56:33.154818 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.154687 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" Apr 23 13:56:33.154818 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.154691 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerDied","Data":"f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c"} Apr 23 13:56:33.154818 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.154731 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6" event={"ID":"0c952a49-dbb7-4899-8e4e-80c16deba78b","Type":"ContainerDied","Data":"7c625432c6d8512fc1cbdfdd1a1eb023c02280ae2e6bdc89ed58bfad62e72f49"} Apr 23 13:56:33.154818 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.154747 2582 scope.go:117] "RemoveContainer" containerID="8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191" Apr 23 13:56:33.162902 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.162886 2582 scope.go:117] "RemoveContainer" containerID="f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c" Apr 23 13:56:33.169906 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.169888 2582 scope.go:117] "RemoveContainer" containerID="4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4" Apr 23 13:56:33.176799 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.176774 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6"] Apr 23 13:56:33.177540 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.177525 2582 scope.go:117] "RemoveContainer" containerID="8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191" Apr 23 13:56:33.177779 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:56:33.177764 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191\": container with ID starting with 8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191 not found: ID does not exist" containerID="8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191" Apr 23 13:56:33.177824 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.177787 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191"} err="failed to get container status \"8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191\": rpc error: code = NotFound desc = could not find container \"8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191\": container with ID starting with 8281a148b549e45cd675443316f6d7ad5069ecfdeb75525a4a45f72c7e216191 not found: ID does not exist" Apr 23 13:56:33.177824 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.177804 2582 scope.go:117] "RemoveContainer" containerID="f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c" Apr 23 13:56:33.178008 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:56:33.177993 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c\": container with ID starting with f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c not found: ID does not exist" 
containerID="f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c" Apr 23 13:56:33.178054 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.178010 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c"} err="failed to get container status \"f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c\": rpc error: code = NotFound desc = could not find container \"f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c\": container with ID starting with f65d7d56d22c60b256ca2205805505abbacdeae1b23231b279145210509fcf2c not found: ID does not exist" Apr 23 13:56:33.178054 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.178022 2582 scope.go:117] "RemoveContainer" containerID="4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4" Apr 23 13:56:33.178226 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:56:33.178202 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4\": container with ID starting with 4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4 not found: ID does not exist" containerID="4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4" Apr 23 13:56:33.178280 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.178235 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4"} err="failed to get container status \"4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4\": rpc error: code = NotFound desc = could not find container \"4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4\": container with ID starting with 4c019954bf4eb93ae5138be80c9590eec7f0abe6582f33fa8a58329f4a7fa7e4 not found: ID does not exist" Apr 23 13:56:33.180398 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.180376 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4jxs6"] Apr 23 13:56:33.669792 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:33.669754 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" path="/var/lib/kubelet/pods/0c952a49-dbb7-4899-8e4e-80c16deba78b/volumes" Apr 23 13:56:37.613897 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:37.613865 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:56:37.614973 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:56:37.614953 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 13:58:11.410983 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.410940 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc"] Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411334 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411348 2582 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411361 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="storage-initializer" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411370 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="storage-initializer" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411392 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="storage-initializer" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411401 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="storage-initializer" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411410 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kube-rbac-proxy" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411418 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kube-rbac-proxy" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411434 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411441 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411453 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kube-rbac-proxy" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411460 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kube-rbac-proxy" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411518 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kserve-container" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411531 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kserve-container" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411539 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c952a49-dbb7-4899-8e4e-80c16deba78b" containerName="kube-rbac-proxy" Apr 23 13:58:11.413348 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.411552 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="553e244d-2d11-4a3b-b422-09efbe6c3610" containerName="kube-rbac-proxy" Apr 23 13:58:11.414671 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.414654 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.417778 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.417754 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 23 13:58:11.417905 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.417754 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 23 13:58:11.418253 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.418236 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:58:11.418969 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.418950 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:58:11.419085 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.418985 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 13:58:11.426853 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.426828 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc"] Apr 23 13:58:11.478377 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.478344 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a35871c-33d1-4275-b918-d622d0fbfc30-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.478529 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.478384 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a35871c-33d1-4275-b918-d622d0fbfc30-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.478529 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.478443 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a35871c-33d1-4275-b918-d622d0fbfc30-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.478529 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.478507 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m9kf\" (UniqueName: \"kubernetes.io/projected/5a35871c-33d1-4275-b918-d622d0fbfc30-kube-api-access-2m9kf\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.578975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.578913 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a35871c-33d1-4275-b918-d622d0fbfc30-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.578975 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.578982 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a35871c-33d1-4275-b918-d622d0fbfc30-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.579261 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.579028 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m9kf\" (UniqueName: \"kubernetes.io/projected/5a35871c-33d1-4275-b918-d622d0fbfc30-kube-api-access-2m9kf\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.579261 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.579052 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a35871c-33d1-4275-b918-d622d0fbfc30-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.579446 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.579422 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a35871c-33d1-4275-b918-d622d0fbfc30-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.579657 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.579640 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a35871c-33d1-4275-b918-d622d0fbfc30-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.581601 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.581583 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a35871c-33d1-4275-b918-d622d0fbfc30-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.595070 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.595036 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m9kf\" (UniqueName: \"kubernetes.io/projected/5a35871c-33d1-4275-b918-d622d0fbfc30-kube-api-access-2m9kf\") pod \"isvc-pmml-runtime-predictor-67bc544947-zqttc\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.725751 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.725661 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:11.850625 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:11.850603 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc"] Apr 23 13:58:11.853113 ip-10-0-139-40 kubenswrapper[2582]: W0423 13:58:11.853084 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a35871c_33d1_4275_b918_d622d0fbfc30.slice/crio-1186ed0b5d2b4ed51758ba3bd6e6e5f67d2c237bc6caaec5e74e5671bfa141d8 WatchSource:0}: Error finding container 1186ed0b5d2b4ed51758ba3bd6e6e5f67d2c237bc6caaec5e74e5671bfa141d8: Status 404 returned error can't find the container with id 1186ed0b5d2b4ed51758ba3bd6e6e5f67d2c237bc6caaec5e74e5671bfa141d8 Apr 23 13:58:12.425172 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:12.425130 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerStarted","Data":"cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4"} Apr 23 13:58:12.425172 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:12.425169 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerStarted","Data":"1186ed0b5d2b4ed51758ba3bd6e6e5f67d2c237bc6caaec5e74e5671bfa141d8"} Apr 23 13:58:16.438027 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:16.437991 2582 generic.go:358] "Generic (PLEG): container finished" podID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerID="cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4" exitCode=0 Apr 23 13:58:16.438454 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:16.438065 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerDied","Data":"cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4"} Apr 23 13:58:23.461765 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:23.461736 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerStarted","Data":"39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86"} Apr 23 13:58:24.466603 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:24.466564 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerStarted","Data":"fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89"} Apr 23 13:58:24.467098 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:24.466780 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:24.493423 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:24.493373 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podStartSLOduration=6.57824799 podStartE2EDuration="13.493360833s" podCreationTimestamp="2026-04-23 13:58:11 +0000 UTC" firstStartedPulling="2026-04-23 13:58:16.439220671 +0000 UTC m=+1599.426461528" lastFinishedPulling="2026-04-23 13:58:23.354333514 +0000 UTC m=+1606.341574371" observedRunningTime="2026-04-23 13:58:24.492879146 +0000 UTC m=+1607.480120026" watchObservedRunningTime="2026-04-23 13:58:24.493360833 +0000 UTC m=+1607.480601711" Apr 23 13:58:25.469085 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:25.469047 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:25.470341 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:25.470315 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:58:26.471248 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:26.471203 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:58:31.476006 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:31.475978 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:58:31.476674 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:31.476639 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:58:41.477548 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:41.477506 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:58:51.476572 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:58:51.476525 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:59:01.476662 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:01.476620 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:59:11.477460 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:11.477415 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: 
connect: connection refused" Apr 23 13:59:21.476604 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:21.476563 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:59:31.476755 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:31.476713 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 13:59:41.477154 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:41.477117 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:59:52.707134 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:52.707052 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc"] Apr 23 13:59:52.707607 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:52.707463 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" containerID="cri-o://39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86" gracePeriod=30 Apr 23 13:59:52.707607 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:52.707473 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kube-rbac-proxy" containerID="cri-o://fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89" gracePeriod=30 Apr 23 13:59:53.724997 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:53.724960 2582 generic.go:358] "Generic (PLEG): container finished" podID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerID="fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89" exitCode=2 Apr 23 13:59:53.725359 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:53.725023 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerDied","Data":"fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89"} Apr 23 13:59:56.452887 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.452862 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:59:56.613851 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.613756 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a35871c-33d1-4275-b918-d622d0fbfc30-proxy-tls\") pod \"5a35871c-33d1-4275-b918-d622d0fbfc30\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " Apr 23 13:59:56.613851 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.613828 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a35871c-33d1-4275-b918-d622d0fbfc30-kserve-provision-location\") pod \"5a35871c-33d1-4275-b918-d622d0fbfc30\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " Apr 23 13:59:56.614101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.613913 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a35871c-33d1-4275-b918-d622d0fbfc30-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"5a35871c-33d1-4275-b918-d622d0fbfc30\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " Apr 23 13:59:56.614101 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.614001 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m9kf\" (UniqueName: \"kubernetes.io/projected/5a35871c-33d1-4275-b918-d622d0fbfc30-kube-api-access-2m9kf\") pod \"5a35871c-33d1-4275-b918-d622d0fbfc30\" (UID: \"5a35871c-33d1-4275-b918-d622d0fbfc30\") " Apr 23 13:59:56.614210 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.614145 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a35871c-33d1-4275-b918-d622d0fbfc30-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5a35871c-33d1-4275-b918-d622d0fbfc30" (UID: "5a35871c-33d1-4275-b918-d622d0fbfc30"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.614248 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.614219 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a35871c-33d1-4275-b918-d622d0fbfc30-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "5a35871c-33d1-4275-b918-d622d0fbfc30" (UID: "5a35871c-33d1-4275-b918-d622d0fbfc30"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:59:56.616127 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.616103 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a35871c-33d1-4275-b918-d622d0fbfc30-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5a35871c-33d1-4275-b918-d622d0fbfc30" (UID: "5a35871c-33d1-4275-b918-d622d0fbfc30"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:56.616214 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.616135 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a35871c-33d1-4275-b918-d622d0fbfc30-kube-api-access-2m9kf" (OuterVolumeSpecName: "kube-api-access-2m9kf") pod "5a35871c-33d1-4275-b918-d622d0fbfc30" (UID: "5a35871c-33d1-4275-b918-d622d0fbfc30"). 
InnerVolumeSpecName "kube-api-access-2m9kf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:56.714729 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.714679 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a35871c-33d1-4275-b918-d622d0fbfc30-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.714729 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.714725 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5a35871c-33d1-4275-b918-d622d0fbfc30-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.714729 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.714737 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2m9kf\" (UniqueName: \"kubernetes.io/projected/5a35871c-33d1-4275-b918-d622d0fbfc30-kube-api-access-2m9kf\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.715003 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.714747 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a35871c-33d1-4275-b918-d622d0fbfc30-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.735010 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.734968 2582 generic.go:358] "Generic (PLEG): container finished" podID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerID="39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86" exitCode=0 Apr 23 13:59:56.735146 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.735052 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" Apr 23 13:59:56.735146 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.735064 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerDied","Data":"39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86"} Apr 23 13:59:56.735146 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.735103 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc" event={"ID":"5a35871c-33d1-4275-b918-d622d0fbfc30","Type":"ContainerDied","Data":"1186ed0b5d2b4ed51758ba3bd6e6e5f67d2c237bc6caaec5e74e5671bfa141d8"} Apr 23 13:59:56.735146 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.735120 2582 scope.go:117] "RemoveContainer" containerID="fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89" Apr 23 13:59:56.743106 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.743090 2582 scope.go:117] "RemoveContainer" containerID="39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86" Apr 23 13:59:56.750221 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.750205 2582 scope.go:117] "RemoveContainer" containerID="cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4" Apr 23 13:59:56.758686 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.755818 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc"] Apr 23 13:59:56.759095 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.759022 2582 scope.go:117] "RemoveContainer" containerID="fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89" Apr 23 13:59:56.759762 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:59:56.759734 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89\": container with ID starting with fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89 not found: ID does not exist" containerID="fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89" Apr 23 13:59:56.759855 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.759772 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89"} err="failed to get container status \"fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89\": rpc error: code = NotFound desc = could not find container \"fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89\": container with ID starting with fcf6a8f1ed12d7c080f1342277e81920e3342b4a0ecf944fe600603a0996eb89 not found: ID does not exist" Apr 23 13:59:56.759855 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.759797 2582 scope.go:117] "RemoveContainer" containerID="39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86" Apr 23 13:59:56.760479 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:59:56.760461 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86\": container with ID starting with 39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86 not found: ID does not exist" 
containerID="39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86" Apr 23 13:59:56.760545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.760487 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86"} err="failed to get container status \"39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86\": rpc error: code = NotFound desc = could not find container \"39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86\": container with ID starting with 39d62ab1fd12ddc789098a3db703982ae5e92853ff6d90256e73abe1e9b21c86 not found: ID does not exist" Apr 23 13:59:56.760545 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.760504 2582 scope.go:117] "RemoveContainer" containerID="cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4" Apr 23 13:59:56.760656 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.760580 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-zqttc"] Apr 23 13:59:56.760781 ip-10-0-139-40 kubenswrapper[2582]: E0423 13:59:56.760766 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4\": container with ID starting with cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4 not found: ID does not exist" containerID="cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4" Apr 23 13:59:56.760824 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:56.760783 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4"} err="failed to get container status \"cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4\": rpc error: code = NotFound desc = could not find container \"cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4\": container with ID starting with cfd1ea2f341e45f093b30f79d343ba3d715ba477ea93cd22f84a0b68719408f4 not found: ID does not exist" Apr 23 13:59:57.669450 ip-10-0-139-40 kubenswrapper[2582]: I0423 13:59:57.669416 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" path="/var/lib/kubelet/pods/5a35871c-33d1-4275-b918-d622d0fbfc30/volumes" Apr 23 14:01:37.631212 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:01:37.631187 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:01:37.633619 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:01:37.633600 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:03:11.108852 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.108822 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2"] Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109091 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kube-rbac-proxy" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109102 2582 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kube-rbac-proxy" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109114 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109121 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109130 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="storage-initializer" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109135 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="storage-initializer" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109188 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kserve-container" Apr 23 14:03:11.109309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.109196 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a35871c-33d1-4275-b918-d622d0fbfc30" containerName="kube-rbac-proxy" Apr 23 14:03:11.112085 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.112058 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.114706 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.114686 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 23 14:03:11.114996 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.114967 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:03:11.115813 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.115793 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:03:11.115865 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.115794 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:03:11.115865 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.115794 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 23 14:03:11.124783 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.124068 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2"] Apr 23 14:03:11.149859 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.149832 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd345d84-005d-4913-af25-a356ae5f852a-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.150000 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:03:11.149867 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4bf\" (UniqueName: \"kubernetes.io/projected/fd345d84-005d-4913-af25-a356ae5f852a-kube-api-access-vd4bf\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.150000 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.149893 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd345d84-005d-4913-af25-a356ae5f852a-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.150000 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.149992 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd345d84-005d-4913-af25-a356ae5f852a-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.250671 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.250635 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd345d84-005d-4913-af25-a356ae5f852a-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.250831 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.250685 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd345d84-005d-4913-af25-a356ae5f852a-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.250831 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.250720 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd345d84-005d-4913-af25-a356ae5f852a-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.250942 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.250879 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4bf\" (UniqueName: \"kubernetes.io/projected/fd345d84-005d-4913-af25-a356ae5f852a-kube-api-access-vd4bf\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.251163 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.251142 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/fd345d84-005d-4913-af25-a356ae5f852a-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.251461 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.251439 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd345d84-005d-4913-af25-a356ae5f852a-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.253299 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.253280 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd345d84-005d-4913-af25-a356ae5f852a-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.260364 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.260338 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4bf\" (UniqueName: \"kubernetes.io/projected/fd345d84-005d-4913-af25-a356ae5f852a-kube-api-access-vd4bf\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.424491 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.424384 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:11.545765 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.545732 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2"] Apr 23 14:03:11.548883 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:03:11.548852 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd345d84_005d_4913_af25_a356ae5f852a.slice/crio-07e40d037678b2a71e44aa6eaa6e0c1564c08e221297a4432d381a7bcab301d7 WatchSource:0}: Error finding container 07e40d037678b2a71e44aa6eaa6e0c1564c08e221297a4432d381a7bcab301d7: Status 404 returned error can't find the container with id 07e40d037678b2a71e44aa6eaa6e0c1564c08e221297a4432d381a7bcab301d7 Apr 23 14:03:11.551083 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:11.551064 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:03:12.266357 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:12.266319 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerStarted","Data":"4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a"} Apr 23 14:03:12.266706 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:12.266363 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerStarted","Data":"07e40d037678b2a71e44aa6eaa6e0c1564c08e221297a4432d381a7bcab301d7"} Apr 23 14:03:15.275261 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:15.275228 2582 generic.go:358] "Generic (PLEG): container finished" podID="fd345d84-005d-4913-af25-a356ae5f852a" containerID="4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a" exitCode=0 Apr 23 14:03:15.275561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:15.275283 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerDied","Data":"4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a"} Apr 23 14:03:39.350190 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:39.350156 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerStarted","Data":"441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8"} Apr 23 14:03:39.350190 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:39.350194 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerStarted","Data":"ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0"} Apr 23 14:03:39.350627 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:39.350469 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:39.350627 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:39.350589 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:39.351881 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:39.351853 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:03:39.373057 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:39.373016 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podStartSLOduration=5.080269002 podStartE2EDuration="28.373004881s" podCreationTimestamp="2026-04-23 14:03:11 +0000 UTC" firstStartedPulling="2026-04-23 14:03:15.276382286 +0000 UTC m=+1898.263623144" lastFinishedPulling="2026-04-23 14:03:38.569118165 +0000 UTC m=+1921.556359023" observedRunningTime="2026-04-23 14:03:39.371517499 +0000 UTC m=+1922.358758378" watchObservedRunningTime="2026-04-23 14:03:39.373004881 +0000 UTC m=+1922.360245760" Apr 23 14:03:40.357801 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:40.353445 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:03:45.359245 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:45.359216 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:03:45.359828 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:45.359801 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:03:55.360158 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:03:55.360121 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:04:05.360053 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:04:05.360012 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:04:15.359721 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:04:15.359676 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:04:25.359795 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:04:25.359750 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:04:35.360267 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:04:35.360228 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:04:45.360725 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:04:45.360685 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 14:04:55.361096 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:04:55.361055 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:05:00.786817 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.786781 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2"] Apr 23 14:05:00.787385 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.787136 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kube-rbac-proxy" containerID="cri-o://441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8" gracePeriod=30 Apr 23 14:05:00.787385 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.787125 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" containerID="cri-o://ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0" gracePeriod=30 Apr 23 14:05:00.893979 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.893940 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk"] Apr 23 14:05:00.896362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.896340 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:00.899078 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.899054 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 23 14:05:00.899196 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.899094 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 23 14:05:00.908217 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.908191 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk"] Apr 23 14:05:00.980604 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.980571 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w9xq\" (UniqueName: \"kubernetes.io/projected/59e582ab-4107-4f8b-89c0-011c1775c94e-kube-api-access-9w9xq\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:00.980742 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.980614 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e582ab-4107-4f8b-89c0-011c1775c94e-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:00.980742 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.980671 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e582ab-4107-4f8b-89c0-011c1775c94e-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:00.980742 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:00.980686 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e582ab-4107-4f8b-89c0-011c1775c94e-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.082119 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.082036 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9w9xq\" (UniqueName: \"kubernetes.io/projected/59e582ab-4107-4f8b-89c0-011c1775c94e-kube-api-access-9w9xq\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.082119 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.082079 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e582ab-4107-4f8b-89c0-011c1775c94e-proxy-tls\") pod 
\"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.082362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.082141 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e582ab-4107-4f8b-89c0-011c1775c94e-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.082362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.082167 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e582ab-4107-4f8b-89c0-011c1775c94e-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.082681 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.082661 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e582ab-4107-4f8b-89c0-011c1775c94e-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.082823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.082797 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e582ab-4107-4f8b-89c0-011c1775c94e-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.084823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.084804 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e582ab-4107-4f8b-89c0-011c1775c94e-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.091208 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.091181 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w9xq\" (UniqueName: \"kubernetes.io/projected/59e582ab-4107-4f8b-89c0-011c1775c94e-kube-api-access-9w9xq\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.206267 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.206229 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:01.330323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.330288 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk"] Apr 23 14:05:01.334082 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:05:01.334029 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e582ab_4107_4f8b_89c0_011c1775c94e.slice/crio-132ffcee4f5ba3993408689ee99044dfc839b203840405bb77f9aea06170ed09 WatchSource:0}: Error finding container 132ffcee4f5ba3993408689ee99044dfc839b203840405bb77f9aea06170ed09: Status 404 returned error can't find the container with id 132ffcee4f5ba3993408689ee99044dfc839b203840405bb77f9aea06170ed09 Apr 23 14:05:01.567523 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.567486 2582 generic.go:358] "Generic (PLEG): container finished" podID="fd345d84-005d-4913-af25-a356ae5f852a" containerID="441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8" exitCode=2 Apr 23 14:05:01.567695 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.567572 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerDied","Data":"441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8"} Apr 23 14:05:01.568841 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.568820 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerStarted","Data":"c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d"} Apr 23 14:05:01.568984 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:01.568845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerStarted","Data":"132ffcee4f5ba3993408689ee99044dfc839b203840405bb77f9aea06170ed09"} Apr 23 14:05:05.427726 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.427702 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:05:05.514338 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.514243 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4bf\" (UniqueName: \"kubernetes.io/projected/fd345d84-005d-4913-af25-a356ae5f852a-kube-api-access-vd4bf\") pod \"fd345d84-005d-4913-af25-a356ae5f852a\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " Apr 23 14:05:05.514338 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.514302 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd345d84-005d-4913-af25-a356ae5f852a-kserve-provision-location\") pod \"fd345d84-005d-4913-af25-a356ae5f852a\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " Apr 23 14:05:05.514338 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.514339 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd345d84-005d-4913-af25-a356ae5f852a-proxy-tls\") pod \"fd345d84-005d-4913-af25-a356ae5f852a\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " Apr 23 14:05:05.514627 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.514382 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd345d84-005d-4913-af25-a356ae5f852a-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"fd345d84-005d-4913-af25-a356ae5f852a\" (UID: \"fd345d84-005d-4913-af25-a356ae5f852a\") " Apr 23 14:05:05.514712 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.514685 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd345d84-005d-4913-af25-a356ae5f852a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fd345d84-005d-4913-af25-a356ae5f852a" (UID: "fd345d84-005d-4913-af25-a356ae5f852a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:05:05.514830 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.514805 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd345d84-005d-4913-af25-a356ae5f852a-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "fd345d84-005d-4913-af25-a356ae5f852a" (UID: "fd345d84-005d-4913-af25-a356ae5f852a"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:05:05.516491 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.516471 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd345d84-005d-4913-af25-a356ae5f852a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fd345d84-005d-4913-af25-a356ae5f852a" (UID: "fd345d84-005d-4913-af25-a356ae5f852a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:05:05.516560 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.516471 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd345d84-005d-4913-af25-a356ae5f852a-kube-api-access-vd4bf" (OuterVolumeSpecName: "kube-api-access-vd4bf") pod "fd345d84-005d-4913-af25-a356ae5f852a" (UID: "fd345d84-005d-4913-af25-a356ae5f852a"). InnerVolumeSpecName "kube-api-access-vd4bf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:05:05.583202 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.583173 2582 generic.go:358] "Generic (PLEG): container finished" podID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerID="c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d" exitCode=0 Apr 23 14:05:05.583332 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.583249 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerDied","Data":"c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d"} Apr 23 14:05:05.585113 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.585092 2582 generic.go:358] "Generic (PLEG): container finished" podID="fd345d84-005d-4913-af25-a356ae5f852a" containerID="ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0" exitCode=0 Apr 23 14:05:05.585201 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.585146 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerDied","Data":"ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0"} Apr 23 14:05:05.585201 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.585176 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" Apr 23 14:05:05.585201 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.585191 2582 scope.go:117] "RemoveContainer" containerID="441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8" Apr 23 14:05:05.585308 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.585178 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" event={"ID":"fd345d84-005d-4913-af25-a356ae5f852a","Type":"ContainerDied","Data":"07e40d037678b2a71e44aa6eaa6e0c1564c08e221297a4432d381a7bcab301d7"} Apr 23 14:05:05.595271 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.595255 2582 scope.go:117] "RemoveContainer" containerID="ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0" Apr 23 14:05:05.603350 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.603332 2582 scope.go:117] "RemoveContainer" containerID="4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a" Apr 23 14:05:05.613560 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.613540 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2"] Apr 23 14:05:05.615257 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.615239 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd345d84-005d-4913-af25-a356ae5f852a-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:05:05.615323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.615261 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd345d84-005d-4913-af25-a356ae5f852a-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:05:05.615323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.615279 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd345d84-005d-4913-af25-a356ae5f852a-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:05:05.615323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.615295 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vd4bf\" (UniqueName: \"kubernetes.io/projected/fd345d84-005d-4913-af25-a356ae5f852a-kube-api-access-vd4bf\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:05:05.617547 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.617528 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2"] Apr 23 14:05:05.623066 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.622124 2582 scope.go:117] "RemoveContainer" containerID="441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8" Apr 23 14:05:05.623650 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:05:05.623619 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8\": container with ID starting with 441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8 not found: ID does not exist" containerID="441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8" Apr 23 14:05:05.623744 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:05:05.623654 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8"} err="failed to get container status \"441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8\": rpc error: code = NotFound desc = could not find container \"441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8\": container with ID starting with 441a71bc2d54ceae012bf4a63f9978f7a04ce4e15b6201e28c503ee771596ea8 not found: ID does not exist" Apr 23 14:05:05.623744 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.623674 2582 scope.go:117] "RemoveContainer" containerID="ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0" Apr 23 14:05:05.624002 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:05:05.623959 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0\": container with ID starting with ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0 not found: ID does not exist" containerID="ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0" Apr 23 14:05:05.624002 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.623987 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0"} err="failed to get container status \"ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0\": rpc error: code = NotFound desc = could not find container \"ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0\": container with ID starting with ccfdb323d471ec0eda420caff48aabacf3b2758d6d928b7561eedb91257fa7d0 not found: ID does not exist" Apr 23 14:05:05.624152 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.624012 2582 scope.go:117] "RemoveContainer" containerID="4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a" Apr 23 14:05:05.624287 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:05:05.624270 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a\": container with ID starting with 4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a not found: ID does not exist" containerID="4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a" Apr 23 14:05:05.624357 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.624291 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a"} err="failed to get container status \"4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a\": rpc error: code = NotFound desc = could not find container \"4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a\": container with ID starting with 4d04433283b3daa8b1c0fcccd721e5486a224c1ae27bc9cb3891aadcdb17821a not found: ID does not exist" Apr 23 14:05:05.669847 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:05.669822 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd345d84-005d-4913-af25-a356ae5f852a" path="/var/lib/kubelet/pods/fd345d84-005d-4913-af25-a356ae5f852a/volumes" Apr 23 14:05:06.354498 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:06.354448 2582 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 23 14:05:06.360640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:06.360610 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-l8pf2" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: i/o timeout" Apr 23 14:05:06.590503 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:06.590467 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerStarted","Data":"ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f"} Apr 23 14:05:06.590856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:06.590511 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerStarted","Data":"2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c"} Apr 23 14:05:06.590856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:06.590721 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:06.610485 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:06.610381 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podStartSLOduration=6.610366447 podStartE2EDuration="6.610366447s" podCreationTimestamp="2026-04-23 14:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:05:06.609532397 +0000 UTC m=+2009.596773275" watchObservedRunningTime="2026-04-23 14:05:06.610366447 +0000 UTC m=+2009.597607325" Apr 23 14:05:07.593970 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:07.593931 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:07.595204 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:07.595178 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:05:08.597055 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:08.597009 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:05:13.601675 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:13.601645 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:05:13.602262 ip-10-0-139-40 
kubenswrapper[2582]: I0423 14:05:13.602237 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:05:23.602778 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:23.602731 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:05:33.602817 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:33.602775 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:05:43.602726 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:43.602684 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:05:53.602244 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:05:53.602151 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:06:03.603151 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:03.603107 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:06:13.603060 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:13.603014 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:06:23.603622 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:23.603593 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:06:30.994965 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:30.994931 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk"] Apr 23 14:06:30.995373 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:30.995250 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" containerID="cri-o://2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c" gracePeriod=30 Apr 23 14:06:30.995373 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:30.995332 
2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kube-rbac-proxy" containerID="cri-o://ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f" gracePeriod=30 Apr 23 14:06:31.114477 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114447 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm"] Apr 23 14:06:31.114779 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114763 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" Apr 23 14:06:31.114856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114782 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" Apr 23 14:06:31.114856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114795 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kube-rbac-proxy" Apr 23 14:06:31.114856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114803 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kube-rbac-proxy" Apr 23 14:06:31.114856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114814 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="storage-initializer" Apr 23 14:06:31.114856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114823 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="storage-initializer" Apr 23 14:06:31.115200 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114912 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kserve-container" Apr 23 14:06:31.115200 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.114944 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd345d84-005d-4913-af25-a356ae5f852a" containerName="kube-rbac-proxy" Apr 23 14:06:31.121695 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.121671 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.124811 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.124792 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 23 14:06:31.124931 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.124799 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 23 14:06:31.127587 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.127556 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm"] Apr 23 14:06:31.236608 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.236577 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8j2f\" (UniqueName: \"kubernetes.io/projected/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kube-api-access-z8j2f\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.236608 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.236621 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.236823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.236646 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.236823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.236728 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.337928 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.337834 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.337928 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.337881 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8j2f\" (UniqueName: 
\"kubernetes.io/projected/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kube-api-access-z8j2f\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.337928 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.337908 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.338210 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.337941 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.338210 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:06:31.338047 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-serving-cert: secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 23 14:06:31.338210 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:06:31.338118 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls podName:4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773 nodeName:}" failed. No retries permitted until 2026-04-23 14:06:31.838098026 +0000 UTC m=+2094.825338885 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls") pod "isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" (UID: "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773") : secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 23 14:06:31.338395 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.338370 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.338714 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.338697 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.346625 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.346606 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8j2f\" (UniqueName: \"kubernetes.io/projected/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kube-api-access-z8j2f\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.836238 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.836202 2582 generic.go:358] "Generic (PLEG): container finished" podID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerID="ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f" exitCode=2 Apr 23 14:06:31.836408 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.836248 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerDied","Data":"ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f"} Apr 23 14:06:31.841423 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.841395 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:31.843829 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:31.843806 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:32.032174 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:32.032136 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:32.151196 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:32.151169 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm"] Apr 23 14:06:32.153654 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:06:32.153620 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f627ee1_0d9f_4eb5_8e78_b6b56b7e8773.slice/crio-d512deacf7aeb4bfdce1d3d6ea901128140f819e530b68c00d3063c001ab2b54 WatchSource:0}: Error finding container d512deacf7aeb4bfdce1d3d6ea901128140f819e530b68c00d3063c001ab2b54: Status 404 returned error can't find the container with id d512deacf7aeb4bfdce1d3d6ea901128140f819e530b68c00d3063c001ab2b54 Apr 23 14:06:32.840024 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:32.839991 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerStarted","Data":"0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89"} Apr 23 14:06:32.840195 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:32.840031 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerStarted","Data":"d512deacf7aeb4bfdce1d3d6ea901128140f819e530b68c00d3063c001ab2b54"} Apr 23 14:06:33.597748 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:33.597697 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 23 14:06:33.602592 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:33.602567 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 14:06:35.546329 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.546307 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:06:35.668622 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.668601 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w9xq\" (UniqueName: \"kubernetes.io/projected/59e582ab-4107-4f8b-89c0-011c1775c94e-kube-api-access-9w9xq\") pod \"59e582ab-4107-4f8b-89c0-011c1775c94e\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " Apr 23 14:06:35.668824 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.668651 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e582ab-4107-4f8b-89c0-011c1775c94e-proxy-tls\") pod \"59e582ab-4107-4f8b-89c0-011c1775c94e\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " Apr 23 14:06:35.668824 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.668681 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e582ab-4107-4f8b-89c0-011c1775c94e-kserve-provision-location\") pod \"59e582ab-4107-4f8b-89c0-011c1775c94e\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " Apr 23 14:06:35.668824 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.668789 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e582ab-4107-4f8b-89c0-011c1775c94e-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"59e582ab-4107-4f8b-89c0-011c1775c94e\" (UID: \"59e582ab-4107-4f8b-89c0-011c1775c94e\") " Apr 23 14:06:35.669073 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.669047 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e582ab-4107-4f8b-89c0-011c1775c94e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59e582ab-4107-4f8b-89c0-011c1775c94e" (UID: "59e582ab-4107-4f8b-89c0-011c1775c94e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:06:35.669218 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.669162 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e582ab-4107-4f8b-89c0-011c1775c94e-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "59e582ab-4107-4f8b-89c0-011c1775c94e" (UID: "59e582ab-4107-4f8b-89c0-011c1775c94e"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:06:35.671376 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.671354 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e582ab-4107-4f8b-89c0-011c1775c94e-kube-api-access-9w9xq" (OuterVolumeSpecName: "kube-api-access-9w9xq") pod "59e582ab-4107-4f8b-89c0-011c1775c94e" (UID: "59e582ab-4107-4f8b-89c0-011c1775c94e"). InnerVolumeSpecName "kube-api-access-9w9xq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:06:35.671465 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.671398 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e582ab-4107-4f8b-89c0-011c1775c94e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59e582ab-4107-4f8b-89c0-011c1775c94e" (UID: "59e582ab-4107-4f8b-89c0-011c1775c94e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:06:35.770020 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.769993 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e582ab-4107-4f8b-89c0-011c1775c94e-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:06:35.770020 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.770016 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e582ab-4107-4f8b-89c0-011c1775c94e-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:06:35.770020 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.770026 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e582ab-4107-4f8b-89c0-011c1775c94e-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:06:35.770217 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.770036 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9w9xq\" (UniqueName: \"kubernetes.io/projected/59e582ab-4107-4f8b-89c0-011c1775c94e-kube-api-access-9w9xq\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:06:35.849477 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.849451 2582 generic.go:358] "Generic (PLEG): container finished" podID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerID="0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89" exitCode=0 Apr 23 14:06:35.849605 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.849530 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerDied","Data":"0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89"} Apr 23 14:06:35.851341 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.851315 2582 generic.go:358] "Generic (PLEG): container finished" podID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerID="2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c" exitCode=0 Apr 23 14:06:35.851447 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.851351 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerDied","Data":"2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c"} Apr 23 14:06:35.851447 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.851377 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" event={"ID":"59e582ab-4107-4f8b-89c0-011c1775c94e","Type":"ContainerDied","Data":"132ffcee4f5ba3993408689ee99044dfc839b203840405bb77f9aea06170ed09"} Apr 23 14:06:35.851447 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.851396 2582 
scope.go:117] "RemoveContainer" containerID="ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f" Apr 23 14:06:35.851447 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.851404 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk" Apr 23 14:06:35.859123 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.859108 2582 scope.go:117] "RemoveContainer" containerID="2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c" Apr 23 14:06:35.866131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.866113 2582 scope.go:117] "RemoveContainer" containerID="c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d" Apr 23 14:06:35.873205 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.873189 2582 scope.go:117] "RemoveContainer" containerID="ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f" Apr 23 14:06:35.873465 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:06:35.873449 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f\": container with ID starting with ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f not found: ID does not exist" containerID="ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f" Apr 23 14:06:35.873530 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.873471 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f"} err="failed to get container status \"ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f\": rpc error: code = NotFound desc = could not find container \"ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f\": container with ID starting with ccc547525003f39fdaac33c84e341aad6a19c1196e924438e854261103780d1f not found: ID does not exist" Apr 23 14:06:35.873530 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.873486 2582 scope.go:117] "RemoveContainer" containerID="2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c" Apr 23 14:06:35.873709 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:06:35.873692 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c\": container with ID starting with 2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c not found: ID does not exist" containerID="2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c" Apr 23 14:06:35.873753 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.873713 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c"} err="failed to get container status \"2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c\": rpc error: code = NotFound desc = could not find container \"2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c\": container with ID starting with 2e5359c6b4ce38755add1bcc5b7c89a92aae665f3c63a1c1973209148331cc6c not found: ID does not exist" Apr 23 14:06:35.873753 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.873727 2582 scope.go:117] "RemoveContainer" containerID="c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d" Apr 23 14:06:35.873960 
ip-10-0-139-40 kubenswrapper[2582]: E0423 14:06:35.873936 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d\": container with ID starting with c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d not found: ID does not exist" containerID="c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d" Apr 23 14:06:35.874025 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.873970 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d"} err="failed to get container status \"c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d\": rpc error: code = NotFound desc = could not find container \"c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d\": container with ID starting with c53b3fc0d576199214df7038ae89ccf7ccb7b12ec600c0d0633b92fae5ee4b3d not found: ID does not exist" Apr 23 14:06:35.879798 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.879779 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk"] Apr 23 14:06:35.884555 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:35.884535 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-s6wvk"] Apr 23 14:06:36.857038 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:36.857003 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerStarted","Data":"79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b"} Apr 23 14:06:36.857038 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:36.857040 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerStarted","Data":"062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8"} Apr 23 14:06:36.857451 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:36.857372 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:36.877809 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:36.877760 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podStartSLOduration=5.8777469 podStartE2EDuration="5.8777469s" podCreationTimestamp="2026-04-23 14:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:06:36.87576572 +0000 UTC m=+2099.863006598" watchObservedRunningTime="2026-04-23 14:06:36.8777469 +0000 UTC m=+2099.864987779" Apr 23 14:06:37.651238 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:37.651208 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:06:37.654021 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:37.654002 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:06:37.669264 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:37.669225 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" path="/var/lib/kubelet/pods/59e582ab-4107-4f8b-89c0-011c1775c94e/volumes" Apr 23 14:06:37.859792 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:37.859761 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:37.861122 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:37.861093 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:06:38.861870 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:38.861829 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:06:43.867061 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:43.867033 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:06:43.867598 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:43.867567 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:06:53.867668 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:06:53.867629 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:07:03.867752 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:07:03.867708 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:07:13.867810 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:07:13.867771 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:07:23.868578 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:07:23.868488 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:07:33.868350 ip-10-0-139-40 
kubenswrapper[2582]: I0423 14:07:33.868310 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:07:43.868620 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:07:43.868583 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:07:53.869082 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:07:53.869051 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:08:01.376195 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.376162 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm"] Apr 23 14:08:01.376790 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.376578 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" containerID="cri-o://062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8" gracePeriod=30 Apr 23 14:08:01.376790 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.376661 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kube-rbac-proxy" containerID="cri-o://79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b" gracePeriod=30 Apr 23 14:08:01.484869 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.484834 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j"] Apr 23 14:08:01.485218 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485201 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kube-rbac-proxy" Apr 23 14:08:01.485362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485221 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kube-rbac-proxy" Apr 23 14:08:01.485362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485251 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" Apr 23 14:08:01.485362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485260 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" Apr 23 14:08:01.485362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485273 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="storage-initializer" Apr 23 14:08:01.485362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485282 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="storage-initializer" 
Apr 23 14:08:01.485362 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485363 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kube-rbac-proxy" Apr 23 14:08:01.485673 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.485375 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="59e582ab-4107-4f8b-89c0-011c1775c94e" containerName="kserve-container" Apr 23 14:08:01.487880 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.487859 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.490223 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.490204 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 23 14:08:01.490330 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.490310 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 23 14:08:01.498967 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.498909 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j"] Apr 23 14:08:01.603678 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.603636 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb08a251-9ca6-4799-90f3-54f489119044-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.603678 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.603679 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb08a251-9ca6-4799-90f3-54f489119044-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.603986 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.603700 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08a251-9ca6-4799-90f3-54f489119044-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.603986 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.603824 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl2rr\" (UniqueName: \"kubernetes.io/projected/cb08a251-9ca6-4799-90f3-54f489119044-kube-api-access-wl2rr\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.705196 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.705156 2582 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wl2rr\" (UniqueName: \"kubernetes.io/projected/cb08a251-9ca6-4799-90f3-54f489119044-kube-api-access-wl2rr\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.705387 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.705335 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb08a251-9ca6-4799-90f3-54f489119044-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.705387 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.705377 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb08a251-9ca6-4799-90f3-54f489119044-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.705523 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.705411 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08a251-9ca6-4799-90f3-54f489119044-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.705836 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.705820 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08a251-9ca6-4799-90f3-54f489119044-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.706168 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.706142 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb08a251-9ca6-4799-90f3-54f489119044-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.708110 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.708091 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb08a251-9ca6-4799-90f3-54f489119044-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.714470 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.714443 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl2rr\" (UniqueName: \"kubernetes.io/projected/cb08a251-9ca6-4799-90f3-54f489119044-kube-api-access-wl2rr\") pod 
\"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.797462 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.797418 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:01.927964 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:01.927913 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j"] Apr 23 14:08:01.930161 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:08:01.930121 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb08a251_9ca6_4799_90f3_54f489119044.slice/crio-fa05c404f172aeaff8e544a7e5eb053ae9d340586849f370ec744b7a2893dca3 WatchSource:0}: Error finding container fa05c404f172aeaff8e544a7e5eb053ae9d340586849f370ec744b7a2893dca3: Status 404 returned error can't find the container with id fa05c404f172aeaff8e544a7e5eb053ae9d340586849f370ec744b7a2893dca3 Apr 23 14:08:02.092080 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:02.092040 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerStarted","Data":"638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4"} Apr 23 14:08:02.092285 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:02.092091 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerStarted","Data":"fa05c404f172aeaff8e544a7e5eb053ae9d340586849f370ec744b7a2893dca3"} Apr 23 14:08:02.094012 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:02.093977 2582 generic.go:358] "Generic (PLEG): container finished" podID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerID="79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b" exitCode=2 Apr 23 14:08:02.094139 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:02.094025 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerDied","Data":"79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b"} Apr 23 14:08:03.862518 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:03.862475 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 23 14:08:03.867985 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:03.867951 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 14:08:06.107608 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:06.107573 2582 generic.go:358] "Generic (PLEG): container finished" podID="cb08a251-9ca6-4799-90f3-54f489119044" 
containerID="638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4" exitCode=0 Apr 23 14:08:06.107987 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:06.107623 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerDied","Data":"638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4"} Apr 23 14:08:06.917040 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:06.917012 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:08:07.051066 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.050984 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8j2f\" (UniqueName: \"kubernetes.io/projected/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kube-api-access-z8j2f\") pod \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " Apr 23 14:08:07.051066 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.051032 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " Apr 23 14:08:07.051066 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.051055 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls\") pod \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " Apr 23 14:08:07.051355 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.051076 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kserve-provision-location\") pod \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\" (UID: \"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773\") " Apr 23 14:08:07.051455 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.051415 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" (UID: "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:08:07.051455 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.051435 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" (UID: "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:08:07.053346 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.053324 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kube-api-access-z8j2f" (OuterVolumeSpecName: "kube-api-access-z8j2f") pod "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" (UID: "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773"). InnerVolumeSpecName "kube-api-access-z8j2f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:08:07.053346 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.053330 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" (UID: "4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:08:07.112663 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.112628 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerStarted","Data":"a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8"} Apr 23 14:08:07.112663 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.112662 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerStarted","Data":"66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160"} Apr 23 14:08:07.113190 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.112886 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:07.114222 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.114200 2582 generic.go:358] "Generic (PLEG): container finished" podID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerID="062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8" exitCode=0 Apr 23 14:08:07.114311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.114262 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerDied","Data":"062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8"} Apr 23 14:08:07.114311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.114269 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" Apr 23 14:08:07.114311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.114284 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm" event={"ID":"4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773","Type":"ContainerDied","Data":"d512deacf7aeb4bfdce1d3d6ea901128140f819e530b68c00d3063c001ab2b54"} Apr 23 14:08:07.114311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.114299 2582 scope.go:117] "RemoveContainer" containerID="79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b" Apr 23 14:08:07.122763 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.122742 2582 scope.go:117] "RemoveContainer" containerID="062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8" Apr 23 14:08:07.129680 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.129663 2582 scope.go:117] "RemoveContainer" containerID="0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89" Apr 23 14:08:07.134155 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.134101 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podStartSLOduration=6.134083906 podStartE2EDuration="6.134083906s" podCreationTimestamp="2026-04-23 14:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:08:07.132586764 +0000 UTC m=+2190.119827657" watchObservedRunningTime="2026-04-23 14:08:07.134083906 +0000 UTC m=+2190.121324786" Apr 23 14:08:07.137547 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.137507 2582 scope.go:117] "RemoveContainer" containerID="79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b" Apr 23 14:08:07.137773 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:08:07.137753 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b\": container with ID starting with 79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b not found: ID does not exist" containerID="79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b" Apr 23 14:08:07.137845 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.137779 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b"} err="failed to get container status \"79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b\": rpc error: code = NotFound desc = could not find container \"79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b\": container with ID starting with 79856b824e37bccab8c6411052355fba255b48b7db94ce96e880d85d43ff7f7b not found: ID does not exist" Apr 23 14:08:07.137845 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.137796 2582 scope.go:117] "RemoveContainer" containerID="062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8" Apr 23 14:08:07.138024 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:08:07.138004 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8\": container with ID starting with 062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8 not found: ID does 
not exist" containerID="062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8" Apr 23 14:08:07.138094 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.138029 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8"} err="failed to get container status \"062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8\": rpc error: code = NotFound desc = could not find container \"062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8\": container with ID starting with 062fda17baaa3de835f8cf2cb61f35fabb98dabd91741a62c7ef390125e8b9d8 not found: ID does not exist" Apr 23 14:08:07.138094 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.138046 2582 scope.go:117] "RemoveContainer" containerID="0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89" Apr 23 14:08:07.138309 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:08:07.138292 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89\": container with ID starting with 0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89 not found: ID does not exist" containerID="0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89" Apr 23 14:08:07.138371 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.138311 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89"} err="failed to get container status \"0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89\": rpc error: code = NotFound desc = could not find container \"0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89\": container with ID starting with 0b722c7ea9b3b92d59b508d69b58c710093813a3014d57fce4e9fe2b52f1ee89 not found: ID does not exist" Apr 23 14:08:07.145641 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.145616 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm"] Apr 23 14:08:07.150014 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.149995 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mscvm"] Apr 23 14:08:07.152397 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.152367 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8j2f\" (UniqueName: \"kubernetes.io/projected/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kube-api-access-z8j2f\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:08:07.152397 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.152388 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:08:07.152572 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.152404 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:08:07.152572 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.152415 2582 reconciler_common.go:299] "Volume detached 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:08:07.672276 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:07.672240 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" path="/var/lib/kubelet/pods/4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773/volumes" Apr 23 14:08:08.118268 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:08.118184 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:14.127198 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:14.127165 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:08:44.128208 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:44.128168 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 14:08:54.128034 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:08:54.127941 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 14:09:04.128294 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:04.128249 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 14:09:14.128232 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:14.128186 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 14:09:24.131320 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:24.131280 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:09:31.579280 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.579244 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j"] Apr 23 14:09:31.579979 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.579628 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" 
containerID="cri-o://66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160" gracePeriod=30 Apr 23 14:09:31.579979 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.579685 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kube-rbac-proxy" containerID="cri-o://a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8" gracePeriod=30 Apr 23 14:09:31.695842 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.695816 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl"] Apr 23 14:09:31.696111 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696088 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" Apr 23 14:09:31.696111 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696110 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" Apr 23 14:09:31.696310 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696126 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="storage-initializer" Apr 23 14:09:31.696310 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696134 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="storage-initializer" Apr 23 14:09:31.696310 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696158 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kube-rbac-proxy" Apr 23 14:09:31.696310 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696165 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kube-rbac-proxy" Apr 23 14:09:31.696310 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696239 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kserve-container" Apr 23 14:09:31.696310 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.696252 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f627ee1-0d9f-4eb5-8e78-b6b56b7e8773" containerName="kube-rbac-proxy" Apr 23 14:09:31.699098 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.699083 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.701688 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.701670 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 23 14:09:31.701799 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.701671 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 23 14:09:31.707480 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.707458 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl"] Apr 23 14:09:31.793874 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.793843 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c963bbe2-db67-4497-89a4-dd7fd00895ab-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.794065 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.793886 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c963bbe2-db67-4497-89a4-dd7fd00895ab-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.794065 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.793969 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4prbs\" (UniqueName: \"kubernetes.io/projected/c963bbe2-db67-4497-89a4-dd7fd00895ab-kube-api-access-4prbs\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.794065 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.794047 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.894599 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.894566 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c963bbe2-db67-4497-89a4-dd7fd00895ab-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.894782 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.894613 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c963bbe2-db67-4497-89a4-dd7fd00895ab-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.894782 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.894742 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4prbs\" (UniqueName: \"kubernetes.io/projected/c963bbe2-db67-4497-89a4-dd7fd00895ab-kube-api-access-4prbs\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.894941 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.894787 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.895013 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:09:31.894975 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-serving-cert: secret "isvc-predictive-xgboost-v2-predictor-serving-cert" not found Apr 23 14:09:31.895070 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:09:31.895052 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls podName:c963bbe2-db67-4497-89a4-dd7fd00895ab nodeName:}" failed. No retries permitted until 2026-04-23 14:09:32.395030787 +0000 UTC m=+2275.382271651 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls") pod "isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" (UID: "c963bbe2-db67-4497-89a4-dd7fd00895ab") : secret "isvc-predictive-xgboost-v2-predictor-serving-cert" not found Apr 23 14:09:31.895129 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.895083 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c963bbe2-db67-4497-89a4-dd7fd00895ab-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.895394 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.895376 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c963bbe2-db67-4497-89a4-dd7fd00895ab-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:31.903460 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:31.903435 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4prbs\" (UniqueName: \"kubernetes.io/projected/c963bbe2-db67-4497-89a4-dd7fd00895ab-kube-api-access-4prbs\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:32.349298 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.349223 2582 generic.go:358] "Generic (PLEG): container finished" podID="cb08a251-9ca6-4799-90f3-54f489119044" containerID="a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8" exitCode=2 Apr 23 14:09:32.349435 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.349291 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerDied","Data":"a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8"} Apr 23 14:09:32.399718 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.399687 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:32.402213 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.402196 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:32.609250 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.609175 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:32.729124 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.729093 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl"] Apr 23 14:09:32.732089 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:09:32.732057 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc963bbe2_db67_4497_89a4_dd7fd00895ab.slice/crio-604c6b52b2bea977d7fdaf2e007bcf43537f66dfa970f3ff2e3fba20ca8200d3 WatchSource:0}: Error finding container 604c6b52b2bea977d7fdaf2e007bcf43537f66dfa970f3ff2e3fba20ca8200d3: Status 404 returned error can't find the container with id 604c6b52b2bea977d7fdaf2e007bcf43537f66dfa970f3ff2e3fba20ca8200d3 Apr 23 14:09:32.733960 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:32.733940 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:09:33.355964 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:33.355908 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerStarted","Data":"2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0"} Apr 23 14:09:33.355964 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:33.355966 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerStarted","Data":"604c6b52b2bea977d7fdaf2e007bcf43537f66dfa970f3ff2e3fba20ca8200d3"} Apr 23 14:09:34.121980 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:34.121937 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 23 14:09:34.128541 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:34.128518 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 14:09:36.327035 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.327012 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:09:36.365932 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.365892 2582 generic.go:358] "Generic (PLEG): container finished" podID="cb08a251-9ca6-4799-90f3-54f489119044" containerID="66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160" exitCode=0 Apr 23 14:09:36.366104 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.365980 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerDied","Data":"66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160"} Apr 23 14:09:36.366104 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.366002 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" Apr 23 14:09:36.366104 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.366033 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j" event={"ID":"cb08a251-9ca6-4799-90f3-54f489119044","Type":"ContainerDied","Data":"fa05c404f172aeaff8e544a7e5eb053ae9d340586849f370ec744b7a2893dca3"} Apr 23 14:09:36.366104 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.366058 2582 scope.go:117] "RemoveContainer" containerID="a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8" Apr 23 14:09:36.374236 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.374212 2582 scope.go:117] "RemoveContainer" containerID="66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160" Apr 23 14:09:36.433953 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.433871 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl2rr\" (UniqueName: \"kubernetes.io/projected/cb08a251-9ca6-4799-90f3-54f489119044-kube-api-access-wl2rr\") pod \"cb08a251-9ca6-4799-90f3-54f489119044\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " Apr 23 14:09:36.434092 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.433967 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb08a251-9ca6-4799-90f3-54f489119044-proxy-tls\") pod \"cb08a251-9ca6-4799-90f3-54f489119044\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " Apr 23 14:09:36.434092 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.434000 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb08a251-9ca6-4799-90f3-54f489119044-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"cb08a251-9ca6-4799-90f3-54f489119044\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " Apr 23 14:09:36.434092 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.434033 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08a251-9ca6-4799-90f3-54f489119044-kserve-provision-location\") pod \"cb08a251-9ca6-4799-90f3-54f489119044\" (UID: \"cb08a251-9ca6-4799-90f3-54f489119044\") " Apr 23 14:09:36.434387 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.434354 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cb08a251-9ca6-4799-90f3-54f489119044-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "cb08a251-9ca6-4799-90f3-54f489119044" (UID: "cb08a251-9ca6-4799-90f3-54f489119044"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:09:36.434455 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.434411 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08a251-9ca6-4799-90f3-54f489119044-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb08a251-9ca6-4799-90f3-54f489119044" (UID: "cb08a251-9ca6-4799-90f3-54f489119044"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:09:36.436132 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.436108 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb08a251-9ca6-4799-90f3-54f489119044-kube-api-access-wl2rr" (OuterVolumeSpecName: "kube-api-access-wl2rr") pod "cb08a251-9ca6-4799-90f3-54f489119044" (UID: "cb08a251-9ca6-4799-90f3-54f489119044"). InnerVolumeSpecName "kube-api-access-wl2rr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:09:36.436533 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.436515 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb08a251-9ca6-4799-90f3-54f489119044-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cb08a251-9ca6-4799-90f3-54f489119044" (UID: "cb08a251-9ca6-4799-90f3-54f489119044"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:09:36.436635 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.436619 2582 scope.go:117] "RemoveContainer" containerID="638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4" Apr 23 14:09:36.444730 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.444713 2582 scope.go:117] "RemoveContainer" containerID="a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8" Apr 23 14:09:36.445038 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:09:36.445012 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8\": container with ID starting with a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8 not found: ID does not exist" containerID="a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8" Apr 23 14:09:36.445094 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.445040 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8"} err="failed to get container status \"a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8\": rpc error: code = NotFound desc = could not find container \"a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8\": container with ID starting with a284d8c941aed732af5a3ea28bb7f511781cf7a5cbeeb75fed42ed63b0e0afd8 not found: ID does not exist" Apr 23 14:09:36.445094 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.445059 2582 scope.go:117] "RemoveContainer" containerID="66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160" Apr 23 14:09:36.445309 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:09:36.445295 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160\": container with ID starting with 66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160 not found: ID does not exist" containerID="66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160" Apr 23 14:09:36.445350 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.445314 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160"} err="failed to get container status \"66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160\": rpc error: code = NotFound desc = could not find container \"66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160\": container with ID starting with 66e088cf9cea3cc47c4b5030001ab4267d9770bca4d4ada2dc7094056a592160 not found: ID does not exist" Apr 23 14:09:36.445350 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.445327 2582 scope.go:117] "RemoveContainer" containerID="638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4" Apr 23 14:09:36.445571 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:09:36.445552 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4\": container with ID starting with 638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4 not found: ID does not exist" containerID="638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4" Apr 23 14:09:36.445620 ip-10-0-139-40 
kubenswrapper[2582]: I0423 14:09:36.445577 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4"} err="failed to get container status \"638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4\": rpc error: code = NotFound desc = could not find container \"638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4\": container with ID starting with 638c07302be74f4b26260560771d6fd15f22102f9458636274c9d2fdec8c2de4 not found: ID does not exist" Apr 23 14:09:36.534810 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.534781 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wl2rr\" (UniqueName: \"kubernetes.io/projected/cb08a251-9ca6-4799-90f3-54f489119044-kube-api-access-wl2rr\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:09:36.534810 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.534807 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb08a251-9ca6-4799-90f3-54f489119044-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:09:36.534810 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.534818 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb08a251-9ca6-4799-90f3-54f489119044-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:09:36.535060 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.534827 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08a251-9ca6-4799-90f3-54f489119044-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:09:36.687523 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.687453 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j"] Apr 23 14:09:36.691678 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:36.691651 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-9vt2j"] Apr 23 14:09:37.371335 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:37.371299 2582 generic.go:358] "Generic (PLEG): container finished" podID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerID="2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0" exitCode=0 Apr 23 14:09:37.371786 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:37.371372 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerDied","Data":"2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0"} Apr 23 14:09:37.669496 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:37.669460 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb08a251-9ca6-4799-90f3-54f489119044" path="/var/lib/kubelet/pods/cb08a251-9ca6-4799-90f3-54f489119044/volumes" Apr 23 14:09:38.376456 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:38.376413 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" 
event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerStarted","Data":"d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0"} Apr 23 14:09:38.376456 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:38.376451 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerStarted","Data":"6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c"} Apr 23 14:09:38.376872 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:38.376701 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:38.376872 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:38.376764 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:09:38.399732 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:38.399687 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podStartSLOduration=7.399673485 podStartE2EDuration="7.399673485s" podCreationTimestamp="2026-04-23 14:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:09:38.399073793 +0000 UTC m=+2281.386314684" watchObservedRunningTime="2026-04-23 14:09:38.399673485 +0000 UTC m=+2281.386914363" Apr 23 14:09:44.385172 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:09:44.385141 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:10:14.385724 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:10:14.385686 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 14:10:24.386331 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:10:24.386246 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 14:10:34.385804 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:10:34.385768 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 14:10:44.385969 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:10:44.385903 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 14:10:54.389490 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:10:54.389457 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:11:01.810738 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.810707 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl"] Apr 23 14:11:01.811270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.811059 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" containerID="cri-o://6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c" gracePeriod=30 Apr 23 14:11:01.811270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.811092 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kube-rbac-proxy" containerID="cri-o://d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0" gracePeriod=30 Apr 23 14:11:01.921588 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921555 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm"] Apr 23 14:11:01.921846 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921831 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="storage-initializer" Apr 23 14:11:01.921846 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921847 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="storage-initializer" Apr 23 14:11:01.922008 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921856 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" Apr 23 14:11:01.922008 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921862 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" Apr 23 14:11:01.922008 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921872 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kube-rbac-proxy" Apr 23 14:11:01.922008 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921885 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kube-rbac-proxy" Apr 23 14:11:01.922008 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921966 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kube-rbac-proxy" Apr 23 14:11:01.922008 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.921975 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb08a251-9ca6-4799-90f3-54f489119044" containerName="kserve-container" Apr 23 14:11:01.925113 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.925094 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:01.927853 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.927834 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 23 14:11:01.927994 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.927873 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 23 14:11:01.935621 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.935602 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm"] Apr 23 14:11:01.990799 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.990763 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:01.990985 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.990811 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpq6c\" (UniqueName: \"kubernetes.io/projected/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kube-api-access-wpq6c\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:01.990985 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.990849 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:01.990985 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:01.990876 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.091779 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.091693 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.091779 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.091752 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.091999 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.091795 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.091999 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.091820 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpq6c\" (UniqueName: \"kubernetes.io/projected/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kube-api-access-wpq6c\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.092370 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.092343 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.092563 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.092544 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.094276 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.094252 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.101368 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.101349 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpq6c\" (UniqueName: \"kubernetes.io/projected/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kube-api-access-wpq6c\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.237126 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.237094 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:02.361756 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.361684 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm"] Apr 23 14:11:02.364902 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:11:02.364878 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9bdcbad_1b46_4062_a142_b2f9f6a0fca9.slice/crio-13e83a7cdf97e3679288356b34b85beb947dc695da846336f61b20d79c1ea6a8 WatchSource:0}: Error finding container 13e83a7cdf97e3679288356b34b85beb947dc695da846336f61b20d79c1ea6a8: Status 404 returned error can't find the container with id 13e83a7cdf97e3679288356b34b85beb947dc695da846336f61b20d79c1ea6a8 Apr 23 14:11:02.607556 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.607516 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerStarted","Data":"bb22c5ad85871eeade9615ad7653682078491a711f57332f06b2f829eec640bf"} Apr 23 14:11:02.607556 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.607559 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerStarted","Data":"13e83a7cdf97e3679288356b34b85beb947dc695da846336f61b20d79c1ea6a8"} Apr 23 14:11:02.609455 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.609431 2582 generic.go:358] "Generic (PLEG): container finished" podID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerID="d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0" exitCode=2 Apr 23 14:11:02.609546 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:02.609502 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerDied","Data":"d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0"} Apr 23 14:11:04.380578 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:04.380536 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 23 14:11:04.386036 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:04.386006 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 14:11:06.447770 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.447748 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:11:06.528033 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.527952 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c963bbe2-db67-4497-89a4-dd7fd00895ab-kserve-provision-location\") pod \"c963bbe2-db67-4497-89a4-dd7fd00895ab\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " Apr 23 14:11:06.528033 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.527997 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls\") pod \"c963bbe2-db67-4497-89a4-dd7fd00895ab\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " Apr 23 14:11:06.528252 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.528054 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4prbs\" (UniqueName: \"kubernetes.io/projected/c963bbe2-db67-4497-89a4-dd7fd00895ab-kube-api-access-4prbs\") pod \"c963bbe2-db67-4497-89a4-dd7fd00895ab\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " Apr 23 14:11:06.528252 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.528083 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c963bbe2-db67-4497-89a4-dd7fd00895ab-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"c963bbe2-db67-4497-89a4-dd7fd00895ab\" (UID: \"c963bbe2-db67-4497-89a4-dd7fd00895ab\") " Apr 23 14:11:06.528368 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.528286 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c963bbe2-db67-4497-89a4-dd7fd00895ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c963bbe2-db67-4497-89a4-dd7fd00895ab" (UID: "c963bbe2-db67-4497-89a4-dd7fd00895ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:11:06.528468 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.528448 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c963bbe2-db67-4497-89a4-dd7fd00895ab-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "c963bbe2-db67-4497-89a4-dd7fd00895ab" (UID: "c963bbe2-db67-4497-89a4-dd7fd00895ab"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:11:06.530331 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.530299 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c963bbe2-db67-4497-89a4-dd7fd00895ab" (UID: "c963bbe2-db67-4497-89a4-dd7fd00895ab"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:11:06.530431 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.530384 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c963bbe2-db67-4497-89a4-dd7fd00895ab-kube-api-access-4prbs" (OuterVolumeSpecName: "kube-api-access-4prbs") pod "c963bbe2-db67-4497-89a4-dd7fd00895ab" (UID: "c963bbe2-db67-4497-89a4-dd7fd00895ab"). InnerVolumeSpecName "kube-api-access-4prbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:11:06.624396 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.624362 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerID="bb22c5ad85871eeade9615ad7653682078491a711f57332f06b2f829eec640bf" exitCode=0 Apr 23 14:11:06.624559 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.624445 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerDied","Data":"bb22c5ad85871eeade9615ad7653682078491a711f57332f06b2f829eec640bf"} Apr 23 14:11:06.626258 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.626235 2582 generic.go:358] "Generic (PLEG): container finished" podID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerID="6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c" exitCode=0 Apr 23 14:11:06.626357 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.626299 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" Apr 23 14:11:06.626357 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.626319 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerDied","Data":"6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c"} Apr 23 14:11:06.626448 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.626356 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl" event={"ID":"c963bbe2-db67-4497-89a4-dd7fd00895ab","Type":"ContainerDied","Data":"604c6b52b2bea977d7fdaf2e007bcf43537f66dfa970f3ff2e3fba20ca8200d3"} Apr 23 14:11:06.626448 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.626373 2582 scope.go:117] "RemoveContainer" containerID="d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0" Apr 23 14:11:06.628667 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.628644 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4prbs\" (UniqueName: \"kubernetes.io/projected/c963bbe2-db67-4497-89a4-dd7fd00895ab-kube-api-access-4prbs\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:11:06.628780 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.628675 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c963bbe2-db67-4497-89a4-dd7fd00895ab-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:11:06.628780 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.628692 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c963bbe2-db67-4497-89a4-dd7fd00895ab-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:11:06.628780 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.628706 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c963bbe2-db67-4497-89a4-dd7fd00895ab-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:11:06.637070 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.637054 2582 scope.go:117] "RemoveContainer" containerID="6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c" Apr 23 14:11:06.644836 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.644810 2582 scope.go:117] "RemoveContainer" containerID="2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0" Apr 23 14:11:06.660162 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.660135 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl"] Apr 23 14:11:06.664635 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.664610 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mqtrl"] Apr 23 14:11:06.667236 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.667211 2582 scope.go:117] "RemoveContainer" containerID="d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0" Apr 23 14:11:06.667612 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:11:06.667576 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0\": container with ID starting with d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0 not found: ID does not exist" containerID="d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0" Apr 23 14:11:06.667713 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.667618 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0"} err="failed to get container status \"d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0\": rpc error: code = NotFound desc = could not find container \"d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0\": container with ID starting with d77350cd5037d746a35bacd9c5d00bb660dbc6cdf526eb7c32d6f41d7edf59d0 not found: ID does not exist" Apr 23 14:11:06.667713 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.667639 2582 scope.go:117] "RemoveContainer" containerID="6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c" Apr 23 14:11:06.667996 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:11:06.667973 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c\": container with ID starting with 6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c not found: ID does not exist" containerID="6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c" Apr 23 14:11:06.668067 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.668005 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c"} err="failed to get container status 
\"6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c\": rpc error: code = NotFound desc = could not find container \"6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c\": container with ID starting with 6c9732502eeca5eabe1182ed8f7ac1131bbce66f66f6e73f9859bcb5bc340d5c not found: ID does not exist" Apr 23 14:11:06.668067 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.668023 2582 scope.go:117] "RemoveContainer" containerID="2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0" Apr 23 14:11:06.668334 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:11:06.668308 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0\": container with ID starting with 2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0 not found: ID does not exist" containerID="2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0" Apr 23 14:11:06.668429 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:06.668335 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0"} err="failed to get container status \"2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0\": rpc error: code = NotFound desc = could not find container \"2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0\": container with ID starting with 2755abf25e3002629334cdff1a525f3c478ed7aba336d6b018959ffc71c197d0 not found: ID does not exist" Apr 23 14:11:07.631415 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:07.631378 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerStarted","Data":"a3be0879b2b82c2bdac8bfc4378400bc794d4d0b2ee37b71bdda5765c845541a"} Apr 23 14:11:07.631970 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:07.631423 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerStarted","Data":"4a2eb1aa3c2912c2bd199c8f9a67cd103b9f5e8fdcad9b06787381e1b73931bd"} Apr 23 14:11:07.631970 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:07.631627 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:07.664388 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:07.664341 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podStartSLOduration=6.6643272289999995 podStartE2EDuration="6.664327229s" podCreationTimestamp="2026-04-23 14:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:11:07.662865656 +0000 UTC m=+2370.650106532" watchObservedRunningTime="2026-04-23 14:11:07.664327229 +0000 UTC m=+2370.651568107" Apr 23 14:11:07.670151 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:07.670126 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" path="/var/lib/kubelet/pods/c963bbe2-db67-4497-89a4-dd7fd00895ab/volumes" Apr 23 14:11:08.635434 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:11:08.635403 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:14.643761 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:14.643732 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:11:37.676405 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:37.676370 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:11:37.682767 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:37.682747 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:11:44.644294 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:44.644251 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 14:11:54.644415 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:11:54.644321 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 14:12:04.645165 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:04.645118 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 14:12:14.645130 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:14.645087 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 14:12:24.647409 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:24.647375 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:12:32.723219 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:32.723182 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm"] Apr 23 14:12:32.724312 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:32.724239 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" 
containerID="cri-o://4a2eb1aa3c2912c2bd199c8f9a67cd103b9f5e8fdcad9b06787381e1b73931bd" gracePeriod=30 Apr 23 14:12:32.724458 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:32.724281 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kube-rbac-proxy" containerID="cri-o://a3be0879b2b82c2bdac8bfc4378400bc794d4d0b2ee37b71bdda5765c845541a" gracePeriod=30 Apr 23 14:12:32.855539 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:32.855506 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerID="a3be0879b2b82c2bdac8bfc4378400bc794d4d0b2ee37b71bdda5765c845541a" exitCode=2 Apr 23 14:12:32.855746 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:32.855557 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerDied","Data":"a3be0879b2b82c2bdac8bfc4378400bc794d4d0b2ee37b71bdda5765c845541a"} Apr 23 14:12:34.639034 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:34.638975 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 23 14:12:34.644340 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:34.644313 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 14:12:37.871609 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:37.871574 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerID="4a2eb1aa3c2912c2bd199c8f9a67cd103b9f5e8fdcad9b06787381e1b73931bd" exitCode=0 Apr 23 14:12:37.871966 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:37.871650 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerDied","Data":"4a2eb1aa3c2912c2bd199c8f9a67cd103b9f5e8fdcad9b06787381e1b73931bd"} Apr 23 14:12:38.367908 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.367884 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:12:38.505394 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.505300 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kserve-provision-location\") pod \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " Apr 23 14:12:38.505394 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.505340 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpq6c\" (UniqueName: \"kubernetes.io/projected/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kube-api-access-wpq6c\") pod \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " Apr 23 14:12:38.505394 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.505384 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-proxy-tls\") pod \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " Apr 23 14:12:38.505687 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.505417 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\" (UID: \"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9\") " Apr 23 14:12:38.505687 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.505657 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" (UID: "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:12:38.505833 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.505810 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" (UID: "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:12:38.507660 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.507629 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" (UID: "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:12:38.507766 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.507674 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kube-api-access-wpq6c" (OuterVolumeSpecName: "kube-api-access-wpq6c") pod "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" (UID: "f9bdcbad-1b46-4062-a142-b2f9f6a0fca9"). InnerVolumeSpecName "kube-api-access-wpq6c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:12:38.606817 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.606780 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:12:38.606817 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.606810 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:12:38.606817 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.606821 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:12:38.607064 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.606831 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wpq6c\" (UniqueName: \"kubernetes.io/projected/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9-kube-api-access-wpq6c\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:12:38.876136 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.876042 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" event={"ID":"f9bdcbad-1b46-4062-a142-b2f9f6a0fca9","Type":"ContainerDied","Data":"13e83a7cdf97e3679288356b34b85beb947dc695da846336f61b20d79c1ea6a8"} Apr 23 14:12:38.876136 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.876097 2582 scope.go:117] "RemoveContainer" containerID="a3be0879b2b82c2bdac8bfc4378400bc794d4d0b2ee37b71bdda5765c845541a" Apr 23 14:12:38.876136 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.876105 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm" Apr 23 14:12:38.884726 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.884707 2582 scope.go:117] "RemoveContainer" containerID="4a2eb1aa3c2912c2bd199c8f9a67cd103b9f5e8fdcad9b06787381e1b73931bd" Apr 23 14:12:38.891738 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.891728 2582 scope.go:117] "RemoveContainer" containerID="bb22c5ad85871eeade9615ad7653682078491a711f57332f06b2f829eec640bf" Apr 23 14:12:38.899998 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.899979 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm"] Apr 23 14:12:38.902150 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:38.902131 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-2hfzm"] Apr 23 14:12:39.669851 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:12:39.669817 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" path="/var/lib/kubelet/pods/f9bdcbad-1b46-4062-a142-b2f9f6a0fca9/volumes" Apr 23 14:13:55.098640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098608 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh"] Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098902 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098929 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098941 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="storage-initializer" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098949 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="storage-initializer" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098966 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098972 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098980 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kube-rbac-proxy" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098985 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kube-rbac-proxy" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098993 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kube-rbac-proxy" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.098998 2582 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kube-rbac-proxy" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.099004 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="storage-initializer" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.099010 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="storage-initializer" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.099059 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kube-rbac-proxy" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.099068 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kserve-container" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.099074 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9bdcbad-1b46-4062-a142-b2f9f6a0fca9" containerName="kube-rbac-proxy" Apr 23 14:13:55.099131 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.099081 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="c963bbe2-db67-4497-89a4-dd7fd00895ab" containerName="kserve-container" Apr 23 14:13:55.102223 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.102208 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.105043 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.105021 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:13:55.105043 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.105032 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:13:55.105228 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.105085 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 23 14:13:55.106011 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.105994 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:13:55.106074 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.106007 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 23 14:13:55.113884 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.113864 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh"] Apr 23 14:13:55.148195 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.148169 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d40203-3082-4215-9f0e-6b4b45504953-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.148323 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:13:55.148203 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3d40203-3082-4215-9f0e-6b4b45504953-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.148323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.148232 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w247\" (UniqueName: \"kubernetes.io/projected/b3d40203-3082-4215-9f0e-6b4b45504953-kube-api-access-4w247\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.148323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.148250 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3d40203-3082-4215-9f0e-6b4b45504953-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.249168 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.249134 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d40203-3082-4215-9f0e-6b4b45504953-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.249316 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.249176 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3d40203-3082-4215-9f0e-6b4b45504953-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.249316 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.249216 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w247\" (UniqueName: \"kubernetes.io/projected/b3d40203-3082-4215-9f0e-6b4b45504953-kube-api-access-4w247\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.249316 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.249245 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3d40203-3082-4215-9f0e-6b4b45504953-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.249547 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.249524 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b3d40203-3082-4215-9f0e-6b4b45504953-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.249837 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.249818 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3d40203-3082-4215-9f0e-6b4b45504953-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.251842 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.251821 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3d40203-3082-4215-9f0e-6b4b45504953-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.257840 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.257820 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w247\" (UniqueName: \"kubernetes.io/projected/b3d40203-3082-4215-9f0e-6b4b45504953-kube-api-access-4w247\") pod \"sklearn-v2-mlserver-predictor-65d8664766-rfwrh\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.412938 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.412887 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:13:55.542215 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:55.539198 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh"] Apr 23 14:13:55.545229 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:13:55.545194 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d40203_3082_4215_9f0e_6b4b45504953.slice/crio-7ae15ff75e61ec90e091bb1dea65fc670ef0545cbc1dbbb1b641b8ec46d31a06 WatchSource:0}: Error finding container 7ae15ff75e61ec90e091bb1dea65fc670ef0545cbc1dbbb1b641b8ec46d31a06: Status 404 returned error can't find the container with id 7ae15ff75e61ec90e091bb1dea65fc670ef0545cbc1dbbb1b641b8ec46d31a06 Apr 23 14:13:56.081196 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:56.081151 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerStarted","Data":"8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13"} Apr 23 14:13:56.081196 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:13:56.081198 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerStarted","Data":"7ae15ff75e61ec90e091bb1dea65fc670ef0545cbc1dbbb1b641b8ec46d31a06"} Apr 23 14:14:00.093188 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:00.093149 2582 generic.go:358] "Generic (PLEG): container finished" podID="b3d40203-3082-4215-9f0e-6b4b45504953" containerID="8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13" exitCode=0 Apr 23 14:14:00.093585 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:00.093218 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerDied","Data":"8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13"} Apr 23 14:14:01.097368 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:01.097326 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerStarted","Data":"dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3"} Apr 23 14:14:01.097368 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:01.097368 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerStarted","Data":"036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e"} Apr 23 14:14:01.097959 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:01.097618 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:14:01.116892 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:01.116842 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" podStartSLOduration=6.11683065 podStartE2EDuration="6.11683065s" podCreationTimestamp="2026-04-23 14:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:14:01.115588683 +0000 UTC m=+2544.102829561" watchObservedRunningTime="2026-04-23 14:14:01.11683065 +0000 UTC m=+2544.104071529" Apr 23 14:14:02.102991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:02.102955 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:14:08.110789 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:08.110760 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:14:38.133266 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:38.133222 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 14:14:48.113666 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:48.113637 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:14:55.186739 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:55.186657 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh"] Apr 23 14:14:55.187240 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:55.187064 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kserve-container" containerID="cri-o://036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e" gracePeriod=30 Apr 23 14:14:55.187240 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:55.187130 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kube-rbac-proxy" containerID="cri-o://dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3" gracePeriod=30 Apr 23 14:14:56.255988 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:56.255950 2582 generic.go:358] "Generic (PLEG): container finished" podID="b3d40203-3082-4215-9f0e-6b4b45504953" containerID="dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3" exitCode=2 Apr 23 14:14:56.255988 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:56.255968 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerDied","Data":"dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3"} Apr 23 14:14:58.105622 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:14:58.105577 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused" Apr 23 14:15:01.825176 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.825150 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:15:01.946941 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.946840 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d40203-3082-4215-9f0e-6b4b45504953-kserve-provision-location\") pod \"b3d40203-3082-4215-9f0e-6b4b45504953\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " Apr 23 14:15:01.946941 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.946903 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3d40203-3082-4215-9f0e-6b4b45504953-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"b3d40203-3082-4215-9f0e-6b4b45504953\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " Apr 23 14:15:01.947185 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.946952 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3d40203-3082-4215-9f0e-6b4b45504953-proxy-tls\") pod \"b3d40203-3082-4215-9f0e-6b4b45504953\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " Apr 23 14:15:01.947185 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.947068 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w247\" (UniqueName: \"kubernetes.io/projected/b3d40203-3082-4215-9f0e-6b4b45504953-kube-api-access-4w247\") pod \"b3d40203-3082-4215-9f0e-6b4b45504953\" (UID: \"b3d40203-3082-4215-9f0e-6b4b45504953\") " Apr 23 14:15:01.947185 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.947160 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d40203-3082-4215-9f0e-6b4b45504953-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b3d40203-3082-4215-9f0e-6b4b45504953" (UID: "b3d40203-3082-4215-9f0e-6b4b45504953"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:15:01.947311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.947257 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d40203-3082-4215-9f0e-6b4b45504953-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "b3d40203-3082-4215-9f0e-6b4b45504953" (UID: "b3d40203-3082-4215-9f0e-6b4b45504953"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:15:01.947311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.947291 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d40203-3082-4215-9f0e-6b4b45504953-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:15:01.949051 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.949031 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d40203-3082-4215-9f0e-6b4b45504953-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b3d40203-3082-4215-9f0e-6b4b45504953" (UID: "b3d40203-3082-4215-9f0e-6b4b45504953"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:15:01.949156 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:01.949133 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d40203-3082-4215-9f0e-6b4b45504953-kube-api-access-4w247" (OuterVolumeSpecName: "kube-api-access-4w247") pod "b3d40203-3082-4215-9f0e-6b4b45504953" (UID: "b3d40203-3082-4215-9f0e-6b4b45504953"). InnerVolumeSpecName "kube-api-access-4w247". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:15:02.048247 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.048205 2582 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3d40203-3082-4215-9f0e-6b4b45504953-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:15:02.048247 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.048242 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3d40203-3082-4215-9f0e-6b4b45504953-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:15:02.048247 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.048253 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4w247\" (UniqueName: \"kubernetes.io/projected/b3d40203-3082-4215-9f0e-6b4b45504953-kube-api-access-4w247\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:15:02.274369 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.274280 2582 generic.go:358] "Generic (PLEG): container finished" podID="b3d40203-3082-4215-9f0e-6b4b45504953" containerID="036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e" exitCode=0 Apr 23 14:15:02.274369 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.274344 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerDied","Data":"036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e"} Apr 23 14:15:02.274569 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.274395 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" event={"ID":"b3d40203-3082-4215-9f0e-6b4b45504953","Type":"ContainerDied","Data":"7ae15ff75e61ec90e091bb1dea65fc670ef0545cbc1dbbb1b641b8ec46d31a06"} Apr 23 14:15:02.274569 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.274411 2582 scope.go:117] "RemoveContainer" containerID="dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3" Apr 23 14:15:02.274569 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.274399 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh" Apr 23 14:15:02.282184 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.282056 2582 scope.go:117] "RemoveContainer" containerID="036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e" Apr 23 14:15:02.289413 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.289395 2582 scope.go:117] "RemoveContainer" containerID="8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13" Apr 23 14:15:02.296235 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.296194 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh"] Apr 23 14:15:02.296414 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.296398 2582 scope.go:117] "RemoveContainer" containerID="dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3" Apr 23 14:15:02.296680 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:15:02.296660 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3\": container with ID starting with dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3 not found: ID does not exist" containerID="dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3" Apr 23 14:15:02.296734 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.296688 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3"} err="failed to get container status \"dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3\": rpc error: code = NotFound desc = could not find container \"dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3\": container with ID starting with dec1f11af2903cdd7090427256de61ce718fa2155dee7f4e9dfb014484f927d3 not found: ID does not exist" Apr 23 14:15:02.296734 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.296708 2582 scope.go:117] "RemoveContainer" containerID="036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e" Apr 23 14:15:02.296967 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:15:02.296946 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e\": container with ID starting with 036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e not found: ID does not exist" containerID="036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e" Apr 23 14:15:02.297023 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.296974 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e"} err="failed to get container status \"036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e\": rpc error: code = NotFound desc = could not find container \"036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e\": container with ID starting with 036d4dd95e77aa79707c603ccdd82522f9ecf8175c4d0a68e1bc94453374871e not found: ID does not exist" Apr 23 14:15:02.297023 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.296994 2582 scope.go:117] "RemoveContainer" containerID="8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13" Apr 23 14:15:02.297247 ip-10-0-139-40 kubenswrapper[2582]: E0423 
14:15:02.297223 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13\": container with ID starting with 8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13 not found: ID does not exist" containerID="8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13" Apr 23 14:15:02.297297 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.297256 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13"} err="failed to get container status \"8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13\": rpc error: code = NotFound desc = could not find container \"8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13\": container with ID starting with 8efd13399db410dc23bf887b4d0f8591e0dd557a8c70b7c73d414df8d51b7c13 not found: ID does not exist" Apr 23 14:15:02.300203 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:02.300181 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-rfwrh"] Apr 23 14:15:03.669289 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:03.669257 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" path="/var/lib/kubelet/pods/b3d40203-3082-4215-9f0e-6b4b45504953/volumes" Apr 23 14:15:32.465574 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465543 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs"] Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465789 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kube-rbac-proxy" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465800 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kube-rbac-proxy" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465810 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="storage-initializer" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465816 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="storage-initializer" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465826 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kserve-container" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465832 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kserve-container" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465879 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kube-rbac-proxy" Apr 23 14:15:32.466048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.465889 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3d40203-3082-4215-9f0e-6b4b45504953" containerName="kserve-container" Apr 23 14:15:32.468888 ip-10-0-139-40 
kubenswrapper[2582]: I0423 14:15:32.468873 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.471195 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.471175 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:15:32.472332 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.472317 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:15:32.472597 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.472582 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 23 14:15:32.473487 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.473466 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:15:32.473487 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.473480 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 14:15:32.486696 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.486670 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs"] Apr 23 14:15:32.560580 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.560549 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340ec98a-8ed4-4477-9246-eb28c5860dd3-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.560704 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.560583 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/340ec98a-8ed4-4477-9246-eb28c5860dd3-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.560704 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.560624 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/340ec98a-8ed4-4477-9246-eb28c5860dd3-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.560704 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.560647 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72x97\" (UniqueName: \"kubernetes.io/projected/340ec98a-8ed4-4477-9246-eb28c5860dd3-kube-api-access-72x97\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.661550 
ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.661504 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72x97\" (UniqueName: \"kubernetes.io/projected/340ec98a-8ed4-4477-9246-eb28c5860dd3-kube-api-access-72x97\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.661727 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.661568 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340ec98a-8ed4-4477-9246-eb28c5860dd3-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.661727 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.661600 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/340ec98a-8ed4-4477-9246-eb28c5860dd3-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.661727 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.661657 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/340ec98a-8ed4-4477-9246-eb28c5860dd3-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.662110 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.662073 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/340ec98a-8ed4-4477-9246-eb28c5860dd3-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.662295 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.662277 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/340ec98a-8ed4-4477-9246-eb28c5860dd3-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.664064 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.664047 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340ec98a-8ed4-4477-9246-eb28c5860dd3-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.671282 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.671258 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72x97\" (UniqueName: 
\"kubernetes.io/projected/340ec98a-8ed4-4477-9246-eb28c5860dd3-kube-api-access-72x97\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.779527 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.779435 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:32.906104 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.906073 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs"] Apr 23 14:15:32.908789 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:15:32.908758 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340ec98a_8ed4_4477_9246_eb28c5860dd3.slice/crio-218f630d3a5a6ea22e6b22b40198918044be2050ab160744907978fbe4cc940e WatchSource:0}: Error finding container 218f630d3a5a6ea22e6b22b40198918044be2050ab160744907978fbe4cc940e: Status 404 returned error can't find the container with id 218f630d3a5a6ea22e6b22b40198918044be2050ab160744907978fbe4cc940e Apr 23 14:15:32.910666 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:32.910651 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:15:33.367990 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:33.367950 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerStarted","Data":"c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d"} Apr 23 14:15:33.367990 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:33.367997 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerStarted","Data":"218f630d3a5a6ea22e6b22b40198918044be2050ab160744907978fbe4cc940e"} Apr 23 14:15:37.380680 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:37.380639 2582 generic.go:358] "Generic (PLEG): container finished" podID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerID="c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d" exitCode=0 Apr 23 14:15:37.381171 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:37.380722 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerDied","Data":"c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d"} Apr 23 14:15:38.385647 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:38.385607 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerStarted","Data":"c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729"} Apr 23 14:15:38.385647 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:38.385643 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerStarted","Data":"f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5"} Apr 23 
14:15:38.386093 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:38.385862 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:38.386093 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:38.385891 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:15:38.406436 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:38.406385 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podStartSLOduration=6.406368994 podStartE2EDuration="6.406368994s" podCreationTimestamp="2026-04-23 14:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:15:38.404784653 +0000 UTC m=+2641.392025532" watchObservedRunningTime="2026-04-23 14:15:38.406368994 +0000 UTC m=+2641.393609872" Apr 23 14:15:44.393431 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:15:44.393402 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:16:14.433375 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:14.433320 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 14:16:24.396331 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:24.396234 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:16:32.487288 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:32.487229 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs"] Apr 23 14:16:32.487742 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:32.487685 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kserve-container" containerID="cri-o://f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5" gracePeriod=30 Apr 23 14:16:32.488024 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:32.487877 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kube-rbac-proxy" containerID="cri-o://c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729" gracePeriod=30 Apr 23 14:16:33.544342 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:33.544303 2582 generic.go:358] "Generic (PLEG): container finished" podID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerID="c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729" exitCode=2 Apr 23 14:16:33.544731 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:33.544351 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" 
event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerDied","Data":"c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729"} Apr 23 14:16:34.390208 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:34.390159 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 23 14:16:35.436138 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:35.436085 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 23 14:16:37.694954 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:37.694789 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:16:37.706238 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:37.706212 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:16:39.390067 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:39.390023 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 23 14:16:40.424654 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.424626 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:16:40.499867 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.499773 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/340ec98a-8ed4-4477-9246-eb28c5860dd3-kserve-provision-location\") pod \"340ec98a-8ed4-4477-9246-eb28c5860dd3\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " Apr 23 14:16:40.499867 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.499853 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340ec98a-8ed4-4477-9246-eb28c5860dd3-proxy-tls\") pod \"340ec98a-8ed4-4477-9246-eb28c5860dd3\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " Apr 23 14:16:40.500149 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.499873 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72x97\" (UniqueName: \"kubernetes.io/projected/340ec98a-8ed4-4477-9246-eb28c5860dd3-kube-api-access-72x97\") pod \"340ec98a-8ed4-4477-9246-eb28c5860dd3\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " Apr 23 14:16:40.500149 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.499897 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/340ec98a-8ed4-4477-9246-eb28c5860dd3-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"340ec98a-8ed4-4477-9246-eb28c5860dd3\" (UID: \"340ec98a-8ed4-4477-9246-eb28c5860dd3\") " Apr 23 14:16:40.500267 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.500165 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340ec98a-8ed4-4477-9246-eb28c5860dd3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "340ec98a-8ed4-4477-9246-eb28c5860dd3" (UID: "340ec98a-8ed4-4477-9246-eb28c5860dd3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:16:40.500311 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.500269 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/340ec98a-8ed4-4477-9246-eb28c5860dd3-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "340ec98a-8ed4-4477-9246-eb28c5860dd3" (UID: "340ec98a-8ed4-4477-9246-eb28c5860dd3"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:16:40.502152 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.502124 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340ec98a-8ed4-4477-9246-eb28c5860dd3-kube-api-access-72x97" (OuterVolumeSpecName: "kube-api-access-72x97") pod "340ec98a-8ed4-4477-9246-eb28c5860dd3" (UID: "340ec98a-8ed4-4477-9246-eb28c5860dd3"). InnerVolumeSpecName "kube-api-access-72x97". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:16:40.502263 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.502190 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340ec98a-8ed4-4477-9246-eb28c5860dd3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "340ec98a-8ed4-4477-9246-eb28c5860dd3" (UID: "340ec98a-8ed4-4477-9246-eb28c5860dd3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:16:40.565801 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.565758 2582 generic.go:358] "Generic (PLEG): container finished" podID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerID="f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5" exitCode=0 Apr 23 14:16:40.566009 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.565834 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerDied","Data":"f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5"} Apr 23 14:16:40.566009 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.565859 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" event={"ID":"340ec98a-8ed4-4477-9246-eb28c5860dd3","Type":"ContainerDied","Data":"218f630d3a5a6ea22e6b22b40198918044be2050ab160744907978fbe4cc940e"} Apr 23 14:16:40.566009 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.565874 2582 scope.go:117] "RemoveContainer" containerID="c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729" Apr 23 14:16:40.566009 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.565837 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs" Apr 23 14:16:40.574300 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.574280 2582 scope.go:117] "RemoveContainer" containerID="f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5" Apr 23 14:16:40.581417 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.581384 2582 scope.go:117] "RemoveContainer" containerID="c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d" Apr 23 14:16:40.587876 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.587842 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs"] Apr 23 14:16:40.588857 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.588835 2582 scope.go:117] "RemoveContainer" containerID="c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729" Apr 23 14:16:40.589262 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:16:40.589231 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729\": container with ID starting with c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729 not found: ID does not exist" containerID="c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729" Apr 23 14:16:40.589376 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.589264 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729"} err="failed to get container status \"c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729\": rpc error: code = NotFound desc = could not find container \"c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729\": container with ID starting with c228236d26465b6f8f1c2d414b3192e46d6de913fb68aa232344e99e811bd729 not found: ID does not exist" Apr 23 14:16:40.589376 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.589284 2582 scope.go:117] "RemoveContainer" containerID="f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5" Apr 23 14:16:40.589589 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:16:40.589570 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5\": container with ID starting with f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5 not found: ID does not exist" containerID="f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5" Apr 23 14:16:40.589653 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.589598 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5"} err="failed to get container status \"f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5\": rpc error: code = NotFound desc = could not find container \"f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5\": container with ID starting with f48f612fce14819769d16f8a1b5404137e65858383fd846a7b3b6c13ddfca9c5 not found: ID does not exist" Apr 23 14:16:40.589653 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.589615 2582 scope.go:117] "RemoveContainer" containerID="c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d" Apr 23 14:16:40.589865 ip-10-0-139-40 kubenswrapper[2582]: E0423 
14:16:40.589848 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d\": container with ID starting with c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d not found: ID does not exist" containerID="c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d" Apr 23 14:16:40.589949 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.589870 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d"} err="failed to get container status \"c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d\": rpc error: code = NotFound desc = could not find container \"c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d\": container with ID starting with c99ed6bb6a12dc0a2ead5495d1df70cef1b648e36f069f7448a69eaf4765ad3d not found: ID does not exist" Apr 23 14:16:40.590137 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.590119 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-hnnbs"] Apr 23 14:16:40.601279 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.601252 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340ec98a-8ed4-4477-9246-eb28c5860dd3-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:16:40.601279 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.601275 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72x97\" (UniqueName: \"kubernetes.io/projected/340ec98a-8ed4-4477-9246-eb28c5860dd3-kube-api-access-72x97\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:16:40.601418 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.601286 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/340ec98a-8ed4-4477-9246-eb28c5860dd3-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:16:40.601418 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:40.601297 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/340ec98a-8ed4-4477-9246-eb28c5860dd3-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:16:41.670249 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:16:41.670215 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" path="/var/lib/kubelet/pods/340ec98a-8ed4-4477-9246-eb28c5860dd3/volumes" Apr 23 14:19:13.016757 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.016719 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx"] Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017097 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kube-rbac-proxy" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017113 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kube-rbac-proxy" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: 
I0423 14:19:13.017140 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="storage-initializer" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017150 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="storage-initializer" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017159 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kserve-container" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017168 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kserve-container" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017227 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kube-rbac-proxy" Apr 23 14:19:13.018991 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.017238 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="340ec98a-8ed4-4477-9246-eb28c5860dd3" containerName="kserve-container" Apr 23 14:19:13.020280 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.020262 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.023119 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.023098 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 23 14:19:13.023235 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.023152 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:19:13.024331 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.024305 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:19:13.024463 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.024359 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:19:13.024463 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.024362 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 23 14:19:13.031297 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.031273 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx"] Apr 23 14:19:13.135837 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.135797 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84f989c1-bda2-4b0c-a070-73b2762f7771-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.136055 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.135856 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqgv5\" (UniqueName: 
\"kubernetes.io/projected/84f989c1-bda2-4b0c-a070-73b2762f7771-kube-api-access-xqgv5\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.136055 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.135907 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.136055 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.136011 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84f989c1-bda2-4b0c-a070-73b2762f7771-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.237468 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.237423 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.237675 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.237500 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84f989c1-bda2-4b0c-a070-73b2762f7771-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.237675 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.237537 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84f989c1-bda2-4b0c-a070-73b2762f7771-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.237675 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.237579 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqgv5\" (UniqueName: \"kubernetes.io/projected/84f989c1-bda2-4b0c-a070-73b2762f7771-kube-api-access-xqgv5\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.237675 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:19:13.237588 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: secret "isvc-tensorflow-predictor-serving-cert" not found Apr 23 14:19:13.237897 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:19:13.237679 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls podName:84f989c1-bda2-4b0c-a070-73b2762f7771 nodeName:}" 
failed. No retries permitted until 2026-04-23 14:19:13.737656455 +0000 UTC m=+2856.724897313 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-b4bhx" (UID: "84f989c1-bda2-4b0c-a070-73b2762f7771") : secret "isvc-tensorflow-predictor-serving-cert" not found Apr 23 14:19:13.238201 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.238176 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84f989c1-bda2-4b0c-a070-73b2762f7771-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.238450 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.238426 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84f989c1-bda2-4b0c-a070-73b2762f7771-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.248566 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.248546 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqgv5\" (UniqueName: \"kubernetes.io/projected/84f989c1-bda2-4b0c-a070-73b2762f7771-kube-api-access-xqgv5\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.742163 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:13.742114 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:13.742352 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:19:13.742253 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: secret "isvc-tensorflow-predictor-serving-cert" not found Apr 23 14:19:13.742352 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:19:13.742324 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls podName:84f989c1-bda2-4b0c-a070-73b2762f7771 nodeName:}" failed. No retries permitted until 2026-04-23 14:19:14.742310168 +0000 UTC m=+2857.729551025 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-b4bhx" (UID: "84f989c1-bda2-4b0c-a070-73b2762f7771") : secret "isvc-tensorflow-predictor-serving-cert" not found Apr 23 14:19:14.750298 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:14.750241 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:14.753038 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:14.753012 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-b4bhx\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:14.833071 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:14.833026 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:14.962427 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:14.962398 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx"] Apr 23 14:19:14.964935 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:19:14.964886 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84f989c1_bda2_4b0c_a070_73b2762f7771.slice/crio-3e3c152e34432629aacd29867b5dabfa42a9033cf17534320a2f64fe4d26a524 WatchSource:0}: Error finding container 3e3c152e34432629aacd29867b5dabfa42a9033cf17534320a2f64fe4d26a524: Status 404 returned error can't find the container with id 3e3c152e34432629aacd29867b5dabfa42a9033cf17534320a2f64fe4d26a524 Apr 23 14:19:14.987587 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:14.987557 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerStarted","Data":"3e3c152e34432629aacd29867b5dabfa42a9033cf17534320a2f64fe4d26a524"} Apr 23 14:19:15.992623 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:15.992577 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerStarted","Data":"c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e"} Apr 23 14:19:21.007023 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:21.006984 2582 generic.go:358] "Generic (PLEG): container finished" podID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerID="c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e" exitCode=0 Apr 23 14:19:21.007387 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:21.007059 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerDied","Data":"c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e"} Apr 23 14:19:25.020992 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:25.020957 
2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerStarted","Data":"1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2"} Apr 23 14:19:26.025676 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:26.025636 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerStarted","Data":"e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991"} Apr 23 14:19:26.026209 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:26.025854 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:26.045344 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:26.045288 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podStartSLOduration=10.129409235 podStartE2EDuration="14.045272072s" podCreationTimestamp="2026-04-23 14:19:12 +0000 UTC" firstStartedPulling="2026-04-23 14:19:21.010412828 +0000 UTC m=+2863.997653685" lastFinishedPulling="2026-04-23 14:19:24.926275661 +0000 UTC m=+2867.913516522" observedRunningTime="2026-04-23 14:19:26.044171638 +0000 UTC m=+2869.031412517" watchObservedRunningTime="2026-04-23 14:19:26.045272072 +0000 UTC m=+2869.032512950" Apr 23 14:19:27.028265 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:27.028236 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:27.029548 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:27.029520 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 14:19:28.031472 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:28.031433 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 14:19:33.036071 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:33.036039 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:19:33.036628 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:33.036599 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 14:19:43.037713 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:19:43.037676 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:20:04.277554 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:04.277523 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx"] Apr 23 14:20:04.278159 
ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:04.277958 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" containerID="cri-o://1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2" gracePeriod=30 Apr 23 14:20:04.278159 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:04.278026 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" containerID="cri-o://e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991" gracePeriod=30 Apr 23 14:20:05.139778 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:05.139741 2582 generic.go:358] "Generic (PLEG): container finished" podID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerID="e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991" exitCode=2 Apr 23 14:20:05.139980 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:05.139812 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerDied","Data":"e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991"} Apr 23 14:20:08.031786 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:08.031746 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 14:20:13.032228 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:13.032179 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 14:20:18.032514 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:18.032465 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 14:20:18.032899 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:18.032585 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:20:23.032463 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:23.032418 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 14:20:28.032595 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:28.032553 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 14:20:33.032330 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:33.032292 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 14:20:34.915885 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:34.915858 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:20:35.040180 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.040143 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") pod \"84f989c1-bda2-4b0c-a070-73b2762f7771\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " Apr 23 14:20:35.040180 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.040184 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84f989c1-bda2-4b0c-a070-73b2762f7771-kserve-provision-location\") pod \"84f989c1-bda2-4b0c-a070-73b2762f7771\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " Apr 23 14:20:35.040440 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.040237 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84f989c1-bda2-4b0c-a070-73b2762f7771-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"84f989c1-bda2-4b0c-a070-73b2762f7771\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " Apr 23 14:20:35.040440 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.040274 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqgv5\" (UniqueName: \"kubernetes.io/projected/84f989c1-bda2-4b0c-a070-73b2762f7771-kube-api-access-xqgv5\") pod \"84f989c1-bda2-4b0c-a070-73b2762f7771\" (UID: \"84f989c1-bda2-4b0c-a070-73b2762f7771\") " Apr 23 14:20:35.040658 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.040622 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f989c1-bda2-4b0c-a070-73b2762f7771-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "84f989c1-bda2-4b0c-a070-73b2762f7771" (UID: "84f989c1-bda2-4b0c-a070-73b2762f7771"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:20:35.042616 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.042584 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f989c1-bda2-4b0c-a070-73b2762f7771-kube-api-access-xqgv5" (OuterVolumeSpecName: "kube-api-access-xqgv5") pod "84f989c1-bda2-4b0c-a070-73b2762f7771" (UID: "84f989c1-bda2-4b0c-a070-73b2762f7771"). InnerVolumeSpecName "kube-api-access-xqgv5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:20:35.042750 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.042627 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "84f989c1-bda2-4b0c-a070-73b2762f7771" (UID: "84f989c1-bda2-4b0c-a070-73b2762f7771"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:20:35.051746 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.051714 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84f989c1-bda2-4b0c-a070-73b2762f7771-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84f989c1-bda2-4b0c-a070-73b2762f7771" (UID: "84f989c1-bda2-4b0c-a070-73b2762f7771"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:20:35.141181 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.141142 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84f989c1-bda2-4b0c-a070-73b2762f7771-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:20:35.141181 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.141176 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqgv5\" (UniqueName: \"kubernetes.io/projected/84f989c1-bda2-4b0c-a070-73b2762f7771-kube-api-access-xqgv5\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:20:35.141181 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.141188 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84f989c1-bda2-4b0c-a070-73b2762f7771-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:20:35.141371 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.141199 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84f989c1-bda2-4b0c-a070-73b2762f7771-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:20:35.218133 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.218096 2582 generic.go:358] "Generic (PLEG): container finished" podID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerID="1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2" exitCode=137 Apr 23 14:20:35.218270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.218179 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" Apr 23 14:20:35.218270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.218194 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerDied","Data":"1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2"} Apr 23 14:20:35.218270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.218231 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx" event={"ID":"84f989c1-bda2-4b0c-a070-73b2762f7771","Type":"ContainerDied","Data":"3e3c152e34432629aacd29867b5dabfa42a9033cf17534320a2f64fe4d26a524"} Apr 23 14:20:35.218270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.218251 2582 scope.go:117] "RemoveContainer" containerID="e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991" Apr 23 14:20:35.228700 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.228680 2582 scope.go:117] "RemoveContainer" containerID="1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2" Apr 23 14:20:35.236382 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.236363 2582 scope.go:117] "RemoveContainer" containerID="c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e" Apr 23 14:20:35.242856 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.242833 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx"] Apr 23 14:20:35.243654 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.243580 2582 scope.go:117] "RemoveContainer" containerID="e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991" Apr 23 14:20:35.243942 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:20:35.243867 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991\": container with ID starting with e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991 not found: ID does not exist" containerID="e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991" Apr 23 14:20:35.243942 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.243904 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991"} err="failed to get container status \"e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991\": rpc error: code = NotFound desc = could not find container \"e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991\": container with ID starting with e7e2cd87bc02dbabced4dd7a736719118e3250a63d85900f0504d86038dca991 not found: ID does not exist" Apr 23 14:20:35.244081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.243946 2582 scope.go:117] "RemoveContainer" containerID="1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2" Apr 23 14:20:35.244199 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:20:35.244182 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2\": container with ID starting with 1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2 not found: ID does not exist" 
containerID="1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2" Apr 23 14:20:35.244281 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.244203 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2"} err="failed to get container status \"1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2\": rpc error: code = NotFound desc = could not find container \"1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2\": container with ID starting with 1be76ad444e3eec302393902f1c52b90edb281f39876fc02da827d39948224f2 not found: ID does not exist" Apr 23 14:20:35.244281 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.244216 2582 scope.go:117] "RemoveContainer" containerID="c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e" Apr 23 14:20:35.244520 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:20:35.244502 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e\": container with ID starting with c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e not found: ID does not exist" containerID="c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e" Apr 23 14:20:35.244580 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.244525 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e"} err="failed to get container status \"c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e\": rpc error: code = NotFound desc = could not find container \"c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e\": container with ID starting with c1ad8a72d51e37e122fdfbd1adda4905c86c2597544d637316f09612eaebce6e not found: ID does not exist" Apr 23 14:20:35.245704 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.245681 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-b4bhx"] Apr 23 14:20:35.669752 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:35.669713 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" path="/var/lib/kubelet/pods/84f989c1-bda2-4b0c-a070-73b2762f7771/volumes" Apr 23 14:20:45.375565 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375528 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p"] Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375799 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375815 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375832 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375838 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" Apr 23 
14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375848 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="storage-initializer" Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375854 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="storage-initializer" Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375906 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kserve-container" Apr 23 14:20:45.376052 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.375928 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="84f989c1-bda2-4b0c-a070-73b2762f7771" containerName="kube-rbac-proxy" Apr 23 14:20:45.378945 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.378928 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.381621 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.381580 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 23 14:20:45.381621 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.381612 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 23 14:20:45.381621 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.381618 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:20:45.381867 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.381661 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:20:45.382761 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.382746 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:20:45.390906 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.390885 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p"] Apr 23 14:20:45.519715 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.519674 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhb5\" (UniqueName: \"kubernetes.io/projected/0795b785-58e2-489a-9bee-a978c1b5e2f0-kube-api-access-rjhb5\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.519943 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.519740 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0795b785-58e2-489a-9bee-a978c1b5e2f0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.519943 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.519777 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0795b785-58e2-489a-9bee-a978c1b5e2f0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.519943 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.519804 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0795b785-58e2-489a-9bee-a978c1b5e2f0-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.620501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.620462 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0795b785-58e2-489a-9bee-a978c1b5e2f0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.620501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.620499 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0795b785-58e2-489a-9bee-a978c1b5e2f0-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.620769 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.620540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhb5\" (UniqueName: \"kubernetes.io/projected/0795b785-58e2-489a-9bee-a978c1b5e2f0-kube-api-access-rjhb5\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.620769 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.620575 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0795b785-58e2-489a-9bee-a978c1b5e2f0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.621047 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.621024 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0795b785-58e2-489a-9bee-a978c1b5e2f0-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.621304 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.621287 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0795b785-58e2-489a-9bee-a978c1b5e2f0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.623163 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:20:45.623134 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0795b785-58e2-489a-9bee-a978c1b5e2f0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.629795 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.629774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhb5\" (UniqueName: \"kubernetes.io/projected/0795b785-58e2-489a-9bee-a978c1b5e2f0-kube-api-access-rjhb5\") pod \"isvc-triton-predictor-84bb65d94b-9qb2p\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.692068 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.692033 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:20:45.817403 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.817370 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p"] Apr 23 14:20:45.820739 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:20:45.820707 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0795b785_58e2_489a_9bee_a978c1b5e2f0.slice/crio-fdc00fca5d4cd1c6d5a232251b1c1419fa33cef42d5b369cb0121f2f4f5dd39a WatchSource:0}: Error finding container fdc00fca5d4cd1c6d5a232251b1c1419fa33cef42d5b369cb0121f2f4f5dd39a: Status 404 returned error can't find the container with id fdc00fca5d4cd1c6d5a232251b1c1419fa33cef42d5b369cb0121f2f4f5dd39a Apr 23 14:20:45.822561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:45.822538 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:20:46.250929 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:46.250829 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerStarted","Data":"87407ab15662c04d304f4a87cee12395b51827a9102c932278ae3377ab9f8f7f"} Apr 23 14:20:46.250929 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:46.250867 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerStarted","Data":"fdc00fca5d4cd1c6d5a232251b1c1419fa33cef42d5b369cb0121f2f4f5dd39a"} Apr 23 14:20:50.264740 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:50.264699 2582 generic.go:358] "Generic (PLEG): container finished" podID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerID="87407ab15662c04d304f4a87cee12395b51827a9102c932278ae3377ab9f8f7f" exitCode=0 Apr 23 14:20:50.265133 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:20:50.264773 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerDied","Data":"87407ab15662c04d304f4a87cee12395b51827a9102c932278ae3377ab9f8f7f"} Apr 23 14:22:44.419044 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:44.419013 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:22:44.419615 
ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:44.419599 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:22:45.638985 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:45.638893 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerStarted","Data":"de34d29755950f7df30d1e79c553361b20a1b7fa4bf95e01cb96559aa02e623e"} Apr 23 14:22:45.638985 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:45.638954 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerStarted","Data":"7b4dfec8827817a606ea3f213f169afa736fc860e082fe4eaf8c777c72ef71b2"} Apr 23 14:22:46.642816 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:46.642771 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:22:46.671600 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:46.671547 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" podStartSLOduration=6.550562026 podStartE2EDuration="2m1.671532383s" podCreationTimestamp="2026-04-23 14:20:45 +0000 UTC" firstStartedPulling="2026-04-23 14:20:50.265836261 +0000 UTC m=+2953.253077121" lastFinishedPulling="2026-04-23 14:22:45.386806621 +0000 UTC m=+3068.374047478" observedRunningTime="2026-04-23 14:22:46.671105066 +0000 UTC m=+3069.658345946" watchObservedRunningTime="2026-04-23 14:22:46.671532383 +0000 UTC m=+3069.658773262" Apr 23 14:22:47.647309 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:47.647268 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:22:47.648807 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:47.648749 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 14:22:48.648880 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:48.648839 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 14:22:53.653620 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:53.653591 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:22:53.654323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:53.654305 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:22:57.543110 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:57.543070 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p"] Apr 23 14:22:57.543571 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:57.543477 2582 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kserve-container" containerID="cri-o://7b4dfec8827817a606ea3f213f169afa736fc860e082fe4eaf8c777c72ef71b2" gracePeriod=30 Apr 23 14:22:57.543637 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:57.543618 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kube-rbac-proxy" containerID="cri-o://de34d29755950f7df30d1e79c553361b20a1b7fa4bf95e01cb96559aa02e623e" gracePeriod=30 Apr 23 14:22:57.673777 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:57.673746 2582 generic.go:358] "Generic (PLEG): container finished" podID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerID="de34d29755950f7df30d1e79c553361b20a1b7fa4bf95e01cb96559aa02e623e" exitCode=2 Apr 23 14:22:57.673944 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:57.673791 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerDied","Data":"de34d29755950f7df30d1e79c553361b20a1b7fa4bf95e01cb96559aa02e623e"} Apr 23 14:22:58.649584 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:22:58.649537 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 23 14:23:00.684565 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.684536 2582 generic.go:358] "Generic (PLEG): container finished" podID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerID="7b4dfec8827817a606ea3f213f169afa736fc860e082fe4eaf8c777c72ef71b2" exitCode=0 Apr 23 14:23:00.684881 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.684592 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerDied","Data":"7b4dfec8827817a606ea3f213f169afa736fc860e082fe4eaf8c777c72ef71b2"} Apr 23 14:23:00.972653 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.972628 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:23:00.989385 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.989364 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0795b785-58e2-489a-9bee-a978c1b5e2f0-kserve-provision-location\") pod \"0795b785-58e2-489a-9bee-a978c1b5e2f0\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " Apr 23 14:23:00.989496 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.989402 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0795b785-58e2-489a-9bee-a978c1b5e2f0-proxy-tls\") pod \"0795b785-58e2-489a-9bee-a978c1b5e2f0\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " Apr 23 14:23:00.989496 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.989432 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0795b785-58e2-489a-9bee-a978c1b5e2f0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"0795b785-58e2-489a-9bee-a978c1b5e2f0\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " Apr 23 14:23:00.989496 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.989453 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjhb5\" (UniqueName: \"kubernetes.io/projected/0795b785-58e2-489a-9bee-a978c1b5e2f0-kube-api-access-rjhb5\") pod \"0795b785-58e2-489a-9bee-a978c1b5e2f0\" (UID: \"0795b785-58e2-489a-9bee-a978c1b5e2f0\") " Apr 23 14:23:00.989800 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.989767 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0795b785-58e2-489a-9bee-a978c1b5e2f0-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "0795b785-58e2-489a-9bee-a978c1b5e2f0" (UID: "0795b785-58e2-489a-9bee-a978c1b5e2f0"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:23:00.989907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.989822 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0795b785-58e2-489a-9bee-a978c1b5e2f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0795b785-58e2-489a-9bee-a978c1b5e2f0" (UID: "0795b785-58e2-489a-9bee-a978c1b5e2f0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:23:00.991669 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.991641 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0795b785-58e2-489a-9bee-a978c1b5e2f0-kube-api-access-rjhb5" (OuterVolumeSpecName: "kube-api-access-rjhb5") pod "0795b785-58e2-489a-9bee-a978c1b5e2f0" (UID: "0795b785-58e2-489a-9bee-a978c1b5e2f0"). InnerVolumeSpecName "kube-api-access-rjhb5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:23:00.991759 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:00.991738 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0795b785-58e2-489a-9bee-a978c1b5e2f0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0795b785-58e2-489a-9bee-a978c1b5e2f0" (UID: "0795b785-58e2-489a-9bee-a978c1b5e2f0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:23:01.090414 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.090381 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0795b785-58e2-489a-9bee-a978c1b5e2f0-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:23:01.090414 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.090414 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0795b785-58e2-489a-9bee-a978c1b5e2f0-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:23:01.090654 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.090429 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0795b785-58e2-489a-9bee-a978c1b5e2f0-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:23:01.090654 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.090438 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rjhb5\" (UniqueName: \"kubernetes.io/projected/0795b785-58e2-489a-9bee-a978c1b5e2f0-kube-api-access-rjhb5\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:23:01.689168 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.689134 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" event={"ID":"0795b785-58e2-489a-9bee-a978c1b5e2f0","Type":"ContainerDied","Data":"fdc00fca5d4cd1c6d5a232251b1c1419fa33cef42d5b369cb0121f2f4f5dd39a"} Apr 23 14:23:01.689168 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.689155 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p" Apr 23 14:23:01.689655 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.689179 2582 scope.go:117] "RemoveContainer" containerID="de34d29755950f7df30d1e79c553361b20a1b7fa4bf95e01cb96559aa02e623e" Apr 23 14:23:01.696981 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.696905 2582 scope.go:117] "RemoveContainer" containerID="7b4dfec8827817a606ea3f213f169afa736fc860e082fe4eaf8c777c72ef71b2" Apr 23 14:23:01.703939 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.703908 2582 scope.go:117] "RemoveContainer" containerID="87407ab15662c04d304f4a87cee12395b51827a9102c932278ae3377ab9f8f7f" Apr 23 14:23:01.708324 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.708300 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p"] Apr 23 14:23:01.711871 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:01.711829 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-9qb2p"] Apr 23 14:23:03.669734 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:23:03.669705 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" path="/var/lib/kubelet/pods/0795b785-58e2-489a-9bee-a978c1b5e2f0/volumes" Apr 23 14:24:37.886371 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886333 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8"] Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886686 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="storage-initializer" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886701 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="storage-initializer" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886712 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kube-rbac-proxy" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886718 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kube-rbac-proxy" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886734 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kserve-container" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886741 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kserve-container" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886783 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kserve-container" Apr 23 14:24:37.886907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.886792 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0795b785-58e2-489a-9bee-a978c1b5e2f0" containerName="kube-rbac-proxy" Apr 23 14:24:37.889823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.889807 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:37.892583 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.892554 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t7497\"" Apr 23 14:24:37.892687 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.892596 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 23 14:24:37.892687 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.892610 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 14:24:37.892761 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.892559 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 14:24:37.892761 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.892554 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 23 14:24:37.899066 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.899040 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8"] Apr 23 14:24:37.917312 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.917275 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbbca23-5e2e-4155-9621-f37690318d5f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:37.917480 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.917336 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:37.917480 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.917364 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cbbca23-5e2e-4155-9621-f37690318d5f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:37.917480 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:37.917395 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjjn5\" (UniqueName: \"kubernetes.io/projected/0cbbca23-5e2e-4155-9621-f37690318d5f-kube-api-access-hjjn5\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.018722 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.018682 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbbca23-5e2e-4155-9621-f37690318d5f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.018722 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.018727 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.019039 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.018746 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cbbca23-5e2e-4155-9621-f37690318d5f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.019039 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.018769 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjjn5\" (UniqueName: \"kubernetes.io/projected/0cbbca23-5e2e-4155-9621-f37690318d5f-kube-api-access-hjjn5\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.019039 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:24:38.018855 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-serving-cert: secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 23 14:24:38.019039 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:24:38.018967 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls podName:0cbbca23-5e2e-4155-9621-f37690318d5f nodeName:}" failed. No retries permitted until 2026-04-23 14:24:38.518913408 +0000 UTC m=+3181.506154267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls") pod "isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" (UID: "0cbbca23-5e2e-4155-9621-f37690318d5f") : secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 23 14:24:38.019257 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.019177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbbca23-5e2e-4155-9621-f37690318d5f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.019527 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.019504 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cbbca23-5e2e-4155-9621-f37690318d5f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.028068 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.028042 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjjn5\" (UniqueName: \"kubernetes.io/projected/0cbbca23-5e2e-4155-9621-f37690318d5f-kube-api-access-hjjn5\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.523958 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.523892 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.526492 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.526467 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.802102 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.802013 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:38.923203 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.923102 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8"] Apr 23 14:24:38.925798 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:24:38.925759 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cbbca23_5e2e_4155_9621_f37690318d5f.slice/crio-d409d607b2ca8be56481502bd33849cefc24bcf26a259b87d697148270ad8086 WatchSource:0}: Error finding container d409d607b2ca8be56481502bd33849cefc24bcf26a259b87d697148270ad8086: Status 404 returned error can't find the container with id d409d607b2ca8be56481502bd33849cefc24bcf26a259b87d697148270ad8086 Apr 23 14:24:38.957501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:38.957467 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerStarted","Data":"d409d607b2ca8be56481502bd33849cefc24bcf26a259b87d697148270ad8086"} Apr 23 14:24:39.961443 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:39.961402 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerStarted","Data":"2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc"} Apr 23 14:24:42.971604 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:42.971569 2582 generic.go:358] "Generic (PLEG): container finished" podID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerID="2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc" exitCode=0 Apr 23 14:24:42.972032 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:42.971648 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerDied","Data":"2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc"} Apr 23 14:24:43.976025 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:43.975990 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerStarted","Data":"9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46"} Apr 23 14:24:43.976025 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:43.976030 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerStarted","Data":"eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a"} Apr 23 14:24:43.976479 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:43.976235 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:43.996589 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:43.996532 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" podStartSLOduration=6.99651754 podStartE2EDuration="6.99651754s" podCreationTimestamp="2026-04-23 14:24:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:24:43.994746258 +0000 UTC m=+3186.981987137" watchObservedRunningTime="2026-04-23 14:24:43.99651754 +0000 UTC m=+3186.983758480" Apr 23 14:24:44.979608 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:44.979561 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:24:50.988404 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:24:50.988364 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:25:20.992321 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:20.992235 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:25:27.909937 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:27.909890 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8"] Apr 23 14:25:27.910322 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:27.910241 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kserve-container" containerID="cri-o://eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a" gracePeriod=30 Apr 23 14:25:27.910395 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:27.910291 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kube-rbac-proxy" containerID="cri-o://9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46" gracePeriod=30 Apr 23 14:25:28.000447 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.000413 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz"] Apr 23 14:25:28.003862 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.003836 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.006190 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.006165 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 23 14:25:28.006305 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.006216 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 23 14:25:28.012274 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.012099 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz"] Apr 23 14:25:28.105262 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.105231 2582 generic.go:358] "Generic (PLEG): container finished" podID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerID="9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46" exitCode=2 Apr 23 14:25:28.105456 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.105275 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerDied","Data":"9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46"} Apr 23 14:25:28.107585 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.107563 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8798bb75-2569-492b-bd57-ad61e81d10f4-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.107661 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.107604 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8798bb75-2569-492b-bd57-ad61e81d10f4-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.107661 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.107622 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.107765 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.107699 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rn5\" (UniqueName: \"kubernetes.io/projected/8798bb75-2569-492b-bd57-ad61e81d10f4-kube-api-access-t4rn5\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.209087 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.209003 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8798bb75-2569-492b-bd57-ad61e81d10f4-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.209087 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.209044 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.209087 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.209077 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rn5\" (UniqueName: \"kubernetes.io/projected/8798bb75-2569-492b-bd57-ad61e81d10f4-kube-api-access-t4rn5\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.209361 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.209129 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8798bb75-2569-492b-bd57-ad61e81d10f4-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.209361 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:25:28.209201 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-serving-cert: secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 23 14:25:28.209361 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:25:28.209310 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls podName:8798bb75-2569-492b-bd57-ad61e81d10f4 nodeName:}" failed. No retries permitted until 2026-04-23 14:25:28.709259164 +0000 UTC m=+3231.696500023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls") pod "xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" (UID: "8798bb75-2569-492b-bd57-ad61e81d10f4") : secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 23 14:25:28.209361 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.209348 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8798bb75-2569-492b-bd57-ad61e81d10f4-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.209769 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.209750 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8798bb75-2569-492b-bd57-ad61e81d10f4-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.219289 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.219267 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rn5\" (UniqueName: \"kubernetes.io/projected/8798bb75-2569-492b-bd57-ad61e81d10f4-kube-api-access-t4rn5\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.712466 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.712427 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.714979 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.714952 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-2ckrz\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:28.916241 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:28.916204 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:29.096115 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:29.095952 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz"] Apr 23 14:25:29.098246 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:25:29.098220 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8798bb75_2569_492b_bd57_ad61e81d10f4.slice/crio-cc78e98ff1d1ca12168477e519ccb8adf893632ccc0509424ed52a01c0d91579 WatchSource:0}: Error finding container cc78e98ff1d1ca12168477e519ccb8adf893632ccc0509424ed52a01c0d91579: Status 404 returned error can't find the container with id cc78e98ff1d1ca12168477e519ccb8adf893632ccc0509424ed52a01c0d91579 Apr 23 14:25:29.109727 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:29.109693 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerStarted","Data":"cc78e98ff1d1ca12168477e519ccb8adf893632ccc0509424ed52a01c0d91579"} Apr 23 14:25:30.113454 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:30.113409 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerStarted","Data":"8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a"} Apr 23 14:25:30.983499 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:30.983449 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.45:8643/healthz\": dial tcp 10.133.0.45:8643: connect: connection refused" Apr 23 14:25:30.989906 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:30.989862 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 14:25:33.123955 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:33.123895 2582 generic.go:358] "Generic (PLEG): container finished" podID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerID="8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a" exitCode=0 Apr 23 14:25:33.124345 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:33.123973 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerDied","Data":"8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a"} Apr 23 14:25:34.128933 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.128885 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerStarted","Data":"e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b"} Apr 23 14:25:34.129359 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.128945 2582 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerStarted","Data":"6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4"} Apr 23 14:25:34.129359 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.129170 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:34.129359 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.129285 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:25:34.151035 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.150987 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" podStartSLOduration=7.150973723 podStartE2EDuration="7.150973723s" podCreationTimestamp="2026-04-23 14:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:25:34.148949814 +0000 UTC m=+3237.136190693" watchObservedRunningTime="2026-04-23 14:25:34.150973723 +0000 UTC m=+3237.138214601" Apr 23 14:25:34.444908 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.444879 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:25:34.558432 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.558395 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls\") pod \"0cbbca23-5e2e-4155-9621-f37690318d5f\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " Apr 23 14:25:34.558432 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.558436 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbbca23-5e2e-4155-9621-f37690318d5f-kserve-provision-location\") pod \"0cbbca23-5e2e-4155-9621-f37690318d5f\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " Apr 23 14:25:34.558683 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.558479 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cbbca23-5e2e-4155-9621-f37690318d5f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"0cbbca23-5e2e-4155-9621-f37690318d5f\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " Apr 23 14:25:34.558683 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.558498 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjjn5\" (UniqueName: \"kubernetes.io/projected/0cbbca23-5e2e-4155-9621-f37690318d5f-kube-api-access-hjjn5\") pod \"0cbbca23-5e2e-4155-9621-f37690318d5f\" (UID: \"0cbbca23-5e2e-4155-9621-f37690318d5f\") " Apr 23 14:25:34.558845 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.558814 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbbca23-5e2e-4155-9621-f37690318d5f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0cbbca23-5e2e-4155-9621-f37690318d5f" (UID: "0cbbca23-5e2e-4155-9621-f37690318d5f"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:25:34.558909 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.558888 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cbbca23-5e2e-4155-9621-f37690318d5f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "0cbbca23-5e2e-4155-9621-f37690318d5f" (UID: "0cbbca23-5e2e-4155-9621-f37690318d5f"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:25:34.560852 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.560826 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbbca23-5e2e-4155-9621-f37690318d5f-kube-api-access-hjjn5" (OuterVolumeSpecName: "kube-api-access-hjjn5") pod "0cbbca23-5e2e-4155-9621-f37690318d5f" (UID: "0cbbca23-5e2e-4155-9621-f37690318d5f"). InnerVolumeSpecName "kube-api-access-hjjn5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:25:34.560852 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.560841 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0cbbca23-5e2e-4155-9621-f37690318d5f" (UID: "0cbbca23-5e2e-4155-9621-f37690318d5f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:25:34.659005 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.658968 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cbbca23-5e2e-4155-9621-f37690318d5f-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:25:34.659005 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.658999 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbbca23-5e2e-4155-9621-f37690318d5f-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:25:34.659005 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.659009 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cbbca23-5e2e-4155-9621-f37690318d5f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:25:34.659225 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:34.659021 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjjn5\" (UniqueName: \"kubernetes.io/projected/0cbbca23-5e2e-4155-9621-f37690318d5f-kube-api-access-hjjn5\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:25:35.133528 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.133489 2582 generic.go:358] "Generic (PLEG): container finished" podID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerID="eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a" exitCode=0 Apr 23 14:25:35.133998 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.133554 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" 
event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerDied","Data":"eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a"} Apr 23 14:25:35.133998 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.133591 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" event={"ID":"0cbbca23-5e2e-4155-9621-f37690318d5f","Type":"ContainerDied","Data":"d409d607b2ca8be56481502bd33849cefc24bcf26a259b87d697148270ad8086"} Apr 23 14:25:35.133998 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.133590 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8" Apr 23 14:25:35.133998 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.133607 2582 scope.go:117] "RemoveContainer" containerID="9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46" Apr 23 14:25:35.141499 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.141475 2582 scope.go:117] "RemoveContainer" containerID="eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a" Apr 23 14:25:35.148563 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.148545 2582 scope.go:117] "RemoveContainer" containerID="2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc" Apr 23 14:25:35.155175 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.155154 2582 scope.go:117] "RemoveContainer" containerID="9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46" Apr 23 14:25:35.155418 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:25:35.155401 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46\": container with ID starting with 9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46 not found: ID does not exist" containerID="9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46" Apr 23 14:25:35.155482 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.155425 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46"} err="failed to get container status \"9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46\": rpc error: code = NotFound desc = could not find container \"9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46\": container with ID starting with 9142351b7c154b98d51822fc2d458ea295aaa755f71807e31a5aaf47c5f7cd46 not found: ID does not exist" Apr 23 14:25:35.155482 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.155441 2582 scope.go:117] "RemoveContainer" containerID="eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a" Apr 23 14:25:35.155734 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:25:35.155694 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a\": container with ID starting with eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a not found: ID does not exist" containerID="eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a" Apr 23 14:25:35.155798 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.155728 2582 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a"} err="failed to get container status \"eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a\": rpc error: code = NotFound desc = could not find container \"eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a\": container with ID starting with eb0ef2b567b6ab4804ca2a97e77230f3cdce4ef5005e7573bbd8e03d8061545a not found: ID does not exist" Apr 23 14:25:35.155798 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.155745 2582 scope.go:117] "RemoveContainer" containerID="2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc" Apr 23 14:25:35.156095 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:25:35.156057 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc\": container with ID starting with 2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc not found: ID does not exist" containerID="2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc" Apr 23 14:25:35.156155 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.156103 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc"} err="failed to get container status \"2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc\": rpc error: code = NotFound desc = could not find container \"2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc\": container with ID starting with 2cf6634d14acc6792adb6db2844278428104d7eca462e1d8da29fdba09a1c3dc not found: ID does not exist" Apr 23 14:25:35.157501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.157484 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8"] Apr 23 14:25:35.163676 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.163657 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-5kbn8"] Apr 23 14:25:35.669410 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:35.669377 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" path="/var/lib/kubelet/pods/0cbbca23-5e2e-4155-9621-f37690318d5f/volumes" Apr 23 14:25:40.139824 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:25:40.139795 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:26:10.143323 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:10.143292 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:26:18.076626 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.076587 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz"] Apr 23 14:26:18.077179 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.077000 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kserve-container" containerID="cri-o://6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4" gracePeriod=30 Apr 23 14:26:18.077179 
ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.077067 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kube-rbac-proxy" containerID="cri-o://e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b" gracePeriod=30 Apr 23 14:26:18.146241 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146207 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h"] Apr 23 14:26:18.146509 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146496 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kube-rbac-proxy" Apr 23 14:26:18.146555 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146511 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kube-rbac-proxy" Apr 23 14:26:18.146555 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146525 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="storage-initializer" Apr 23 14:26:18.146555 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146531 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="storage-initializer" Apr 23 14:26:18.146555 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146538 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kserve-container" Apr 23 14:26:18.146555 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146547 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kserve-container" Apr 23 14:26:18.146705 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146602 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kube-rbac-proxy" Apr 23 14:26:18.146705 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.146613 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cbbca23-5e2e-4155-9621-f37690318d5f" containerName="kserve-container" Apr 23 14:26:18.149664 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.149647 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.152281 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.152257 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 23 14:26:18.152396 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.152370 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 23 14:26:18.161212 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.161186 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h"] Apr 23 14:26:18.180637 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.180605 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tchj\" (UniqueName: \"kubernetes.io/projected/25d4bb85-7096-462b-8474-85bd4d407377-kube-api-access-5tchj\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.180823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.180662 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.180823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.180741 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d4bb85-7096-462b-8474-85bd4d407377-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.180823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.180783 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d4bb85-7096-462b-8474-85bd4d407377-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.259308 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.259269 2582 generic.go:358] "Generic (PLEG): container finished" podID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerID="e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b" exitCode=2 Apr 23 14:26:18.259484 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.259342 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerDied","Data":"e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b"} Apr 23 14:26:18.281680 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.281650 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.281828 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.281690 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d4bb85-7096-462b-8474-85bd4d407377-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.281828 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.281710 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d4bb85-7096-462b-8474-85bd4d407377-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.281828 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.281754 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tchj\" (UniqueName: \"kubernetes.io/projected/25d4bb85-7096-462b-8474-85bd4d407377-kube-api-access-5tchj\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.281828 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:26:18.281814 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 23 14:26:18.282040 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:26:18.281896 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls podName:25d4bb85-7096-462b-8474-85bd4d407377 nodeName:}" failed. No retries permitted until 2026-04-23 14:26:18.781876181 +0000 UTC m=+3281.769117038 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-jqv9h" (UID: "25d4bb85-7096-462b-8474-85bd4d407377") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 23 14:26:18.282230 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.282210 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d4bb85-7096-462b-8474-85bd4d407377-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.282347 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.282329 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d4bb85-7096-462b-8474-85bd4d407377-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.293092 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.293066 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tchj\" (UniqueName: \"kubernetes.io/projected/25d4bb85-7096-462b-8474-85bd4d407377-kube-api-access-5tchj\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.785452 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.785411 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:18.788068 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:18.788036 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-jqv9h\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:19.060384 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:19.060290 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:19.188597 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:19.188563 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h"] Apr 23 14:26:19.191640 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:26:19.191611 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d4bb85_7096_462b_8474_85bd4d407377.slice/crio-26fe80d7678c73e60a7cd05af41e9599574b5954ea3842c6b7b69e092f6f1152 WatchSource:0}: Error finding container 26fe80d7678c73e60a7cd05af41e9599574b5954ea3842c6b7b69e092f6f1152: Status 404 returned error can't find the container with id 26fe80d7678c73e60a7cd05af41e9599574b5954ea3842c6b7b69e092f6f1152 Apr 23 14:26:19.193603 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:19.193585 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:26:19.263020 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:19.262987 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerStarted","Data":"6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122"} Apr 23 14:26:19.263020 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:19.263023 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerStarted","Data":"26fe80d7678c73e60a7cd05af41e9599574b5954ea3842c6b7b69e092f6f1152"} Apr 23 14:26:20.134738 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:20.134684 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 23 14:26:23.275304 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:23.275271 2582 generic.go:358] "Generic (PLEG): container finished" podID="25d4bb85-7096-462b-8474-85bd4d407377" containerID="6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122" exitCode=0 Apr 23 14:26:23.275696 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:23.275353 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerDied","Data":"6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122"} Apr 23 14:26:24.853520 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.853491 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:26:24.935497 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.935400 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls\") pod \"8798bb75-2569-492b-bd57-ad61e81d10f4\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " Apr 23 14:26:24.935497 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.935453 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4rn5\" (UniqueName: \"kubernetes.io/projected/8798bb75-2569-492b-bd57-ad61e81d10f4-kube-api-access-t4rn5\") pod \"8798bb75-2569-492b-bd57-ad61e81d10f4\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " Apr 23 14:26:24.935732 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.935544 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8798bb75-2569-492b-bd57-ad61e81d10f4-kserve-provision-location\") pod \"8798bb75-2569-492b-bd57-ad61e81d10f4\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " Apr 23 14:26:24.935732 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.935586 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8798bb75-2569-492b-bd57-ad61e81d10f4-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"8798bb75-2569-492b-bd57-ad61e81d10f4\" (UID: \"8798bb75-2569-492b-bd57-ad61e81d10f4\") " Apr 23 14:26:24.935979 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.935909 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8798bb75-2569-492b-bd57-ad61e81d10f4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8798bb75-2569-492b-bd57-ad61e81d10f4" (UID: "8798bb75-2569-492b-bd57-ad61e81d10f4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:26:24.935979 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.935976 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8798bb75-2569-492b-bd57-ad61e81d10f4-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "8798bb75-2569-492b-bd57-ad61e81d10f4" (UID: "8798bb75-2569-492b-bd57-ad61e81d10f4"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:26:24.937703 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.937682 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8798bb75-2569-492b-bd57-ad61e81d10f4" (UID: "8798bb75-2569-492b-bd57-ad61e81d10f4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:26:24.937859 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:24.937730 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8798bb75-2569-492b-bd57-ad61e81d10f4-kube-api-access-t4rn5" (OuterVolumeSpecName: "kube-api-access-t4rn5") pod "8798bb75-2569-492b-bd57-ad61e81d10f4" (UID: "8798bb75-2569-492b-bd57-ad61e81d10f4"). 
InnerVolumeSpecName "kube-api-access-t4rn5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:26:25.036988 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.036946 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8798bb75-2569-492b-bd57-ad61e81d10f4-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:26:25.036988 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.036980 2582 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8798bb75-2569-492b-bd57-ad61e81d10f4-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:26:25.036988 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.036991 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8798bb75-2569-492b-bd57-ad61e81d10f4-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:26:25.036988 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.037000 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4rn5\" (UniqueName: \"kubernetes.io/projected/8798bb75-2569-492b-bd57-ad61e81d10f4-kube-api-access-t4rn5\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:26:25.285089 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.284988 2582 generic.go:358] "Generic (PLEG): container finished" podID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerID="6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4" exitCode=0 Apr 23 14:26:25.285089 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.285037 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerDied","Data":"6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4"} Apr 23 14:26:25.285089 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.285061 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" Apr 23 14:26:25.285089 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.285071 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz" event={"ID":"8798bb75-2569-492b-bd57-ad61e81d10f4","Type":"ContainerDied","Data":"cc78e98ff1d1ca12168477e519ccb8adf893632ccc0509424ed52a01c0d91579"} Apr 23 14:26:25.285089 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.285091 2582 scope.go:117] "RemoveContainer" containerID="e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b" Apr 23 14:26:25.293601 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.293579 2582 scope.go:117] "RemoveContainer" containerID="6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4" Apr 23 14:26:25.301081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.301061 2582 scope.go:117] "RemoveContainer" containerID="8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a" Apr 23 14:26:25.307825 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.307799 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz"] Apr 23 14:26:25.309600 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.309575 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-2ckrz"] Apr 23 14:26:25.312469 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.312445 2582 scope.go:117] "RemoveContainer" containerID="e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b" Apr 23 14:26:25.312778 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:26:25.312758 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b\": container with ID starting with e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b not found: ID does not exist" containerID="e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b" Apr 23 14:26:25.312844 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.312787 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b"} err="failed to get container status \"e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b\": rpc error: code = NotFound desc = could not find container \"e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b\": container with ID starting with e39074864799a58e883a278bd6ac34266dde840a572c758376d3944ea2e0197b not found: ID does not exist" Apr 23 14:26:25.312844 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.312809 2582 scope.go:117] "RemoveContainer" containerID="6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4" Apr 23 14:26:25.313088 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:26:25.313068 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4\": container with ID starting with 6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4 not found: ID does not exist" containerID="6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4" Apr 23 14:26:25.313161 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.313110 2582 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4"} err="failed to get container status \"6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4\": rpc error: code = NotFound desc = could not find container \"6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4\": container with ID starting with 6b33432707c11fc405b9e33a262bb052e6b3ea8f7ffeb960339f36c32ab91cd4 not found: ID does not exist" Apr 23 14:26:25.313161 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.313133 2582 scope.go:117] "RemoveContainer" containerID="8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a" Apr 23 14:26:25.313353 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:26:25.313334 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a\": container with ID starting with 8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a not found: ID does not exist" containerID="8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a" Apr 23 14:26:25.313405 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.313360 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a"} err="failed to get container status \"8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a\": rpc error: code = NotFound desc = could not find container \"8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a\": container with ID starting with 8c0b933b34f2a26ffb3f28706f27edb82bd8d911ff7828bd885f45d9fcd7551a not found: ID does not exist" Apr 23 14:26:25.671217 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:25.671169 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" path="/var/lib/kubelet/pods/8798bb75-2569-492b-bd57-ad61e81d10f4/volumes" Apr 23 14:26:43.341059 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:43.341018 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerStarted","Data":"8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04"} Apr 23 14:26:43.341059 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:43.341063 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerStarted","Data":"bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c"} Apr 23 14:26:43.341572 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:43.341297 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:43.364970 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:43.364899 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podStartSLOduration=5.938044738 podStartE2EDuration="25.364884537s" podCreationTimestamp="2026-04-23 14:26:18 +0000 UTC" firstStartedPulling="2026-04-23 14:26:23.276576192 +0000 UTC m=+3286.263817049" lastFinishedPulling="2026-04-23 14:26:42.70341599 +0000 UTC m=+3305.690656848" observedRunningTime="2026-04-23 
14:26:43.362351738 +0000 UTC m=+3306.349592617" watchObservedRunningTime="2026-04-23 14:26:43.364884537 +0000 UTC m=+3306.352125415" Apr 23 14:26:44.344135 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:44.344102 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:44.345450 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:44.345418 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:26:45.347147 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:45.347101 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:26:50.351229 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:50.351197 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:26:50.351822 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:26:50.351795 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:27:00.352509 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:00.352470 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:27:10.352492 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:10.352451 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:27:20.352078 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:20.352035 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:27:30.352536 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:30.352499 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:27:40.352582 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:40.352543 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.47:8080: connect: connection refused" Apr 23 14:27:44.440090 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:44.440062 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:27:44.440564 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:44.440104 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:27:50.352112 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:50.352077 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:27:58.258081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.258039 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h"] Apr 23 14:27:58.258477 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.258360 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" containerID="cri-o://bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c" gracePeriod=30 Apr 23 14:27:58.258477 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.258412 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kube-rbac-proxy" containerID="cri-o://8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04" gracePeriod=30 Apr 23 14:27:58.370761 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.370723 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj"] Apr 23 14:27:58.371061 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371047 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="storage-initializer" Apr 23 14:27:58.371128 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371063 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="storage-initializer" Apr 23 14:27:58.371128 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371073 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kube-rbac-proxy" Apr 23 14:27:58.371128 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371080 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kube-rbac-proxy" Apr 23 14:27:58.371128 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371102 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kserve-container" Apr 23 14:27:58.371128 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371109 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kserve-container" Apr 23 14:27:58.371283 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371155 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" 
containerName="kserve-container" Apr 23 14:27:58.371283 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.371164 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="8798bb75-2569-492b-bd57-ad61e81d10f4" containerName="kube-rbac-proxy" Apr 23 14:27:58.374168 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.374150 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.376625 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.376601 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 14:27:58.376724 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.376600 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 23 14:27:58.383611 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.383585 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj"] Apr 23 14:27:58.438792 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.438756 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2654c476-4740-479e-bc60-fee229645bbb-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.439015 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.438800 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2654c476-4740-479e-bc60-fee229645bbb-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.439015 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.438832 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h466\" (UniqueName: \"kubernetes.io/projected/2654c476-4740-479e-bc60-fee229645bbb-kube-api-access-5h466\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.439015 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.438860 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2654c476-4740-479e-bc60-fee229645bbb-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.539501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.539401 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2654c476-4740-479e-bc60-fee229645bbb-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.539501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.539454 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2654c476-4740-479e-bc60-fee229645bbb-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.539501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.539484 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h466\" (UniqueName: \"kubernetes.io/projected/2654c476-4740-479e-bc60-fee229645bbb-kube-api-access-5h466\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.539799 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.539514 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2654c476-4740-479e-bc60-fee229645bbb-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.539968 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.539943 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2654c476-4740-479e-bc60-fee229645bbb-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.540277 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.540261 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2654c476-4740-479e-bc60-fee229645bbb-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.542164 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.542141 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2654c476-4740-479e-bc60-fee229645bbb-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.548490 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.548469 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h466\" (UniqueName: \"kubernetes.io/projected/2654c476-4740-479e-bc60-fee229645bbb-kube-api-access-5h466\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.554560 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.554536 2582 generic.go:358] 
"Generic (PLEG): container finished" podID="25d4bb85-7096-462b-8474-85bd4d407377" containerID="8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04" exitCode=2 Apr 23 14:27:58.554662 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.554592 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerDied","Data":"8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04"} Apr 23 14:27:58.685036 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.685000 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:27:58.815370 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:58.815294 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj"] Apr 23 14:27:58.818276 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:27:58.818248 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2654c476_4740_479e_bc60_fee229645bbb.slice/crio-299c11fecad453a883dd030cb40b091646f5660d0995341fd99625c72c2e5dd6 WatchSource:0}: Error finding container 299c11fecad453a883dd030cb40b091646f5660d0995341fd99625c72c2e5dd6: Status 404 returned error can't find the container with id 299c11fecad453a883dd030cb40b091646f5660d0995341fd99625c72c2e5dd6 Apr 23 14:27:59.559272 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:59.559227 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerStarted","Data":"44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011"} Apr 23 14:27:59.559272 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:27:59.559267 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerStarted","Data":"299c11fecad453a883dd030cb40b091646f5660d0995341fd99625c72c2e5dd6"} Apr 23 14:28:00.348085 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:00.348036 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 14:28:00.351820 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:00.351798 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 14:28:01.896659 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.896629 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:28:01.963492 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.963455 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls\") pod \"25d4bb85-7096-462b-8474-85bd4d407377\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " Apr 23 14:28:01.963700 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.963515 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d4bb85-7096-462b-8474-85bd4d407377-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"25d4bb85-7096-462b-8474-85bd4d407377\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " Apr 23 14:28:01.963700 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.963555 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tchj\" (UniqueName: \"kubernetes.io/projected/25d4bb85-7096-462b-8474-85bd4d407377-kube-api-access-5tchj\") pod \"25d4bb85-7096-462b-8474-85bd4d407377\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " Apr 23 14:28:01.963700 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.963641 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d4bb85-7096-462b-8474-85bd4d407377-kserve-provision-location\") pod \"25d4bb85-7096-462b-8474-85bd4d407377\" (UID: \"25d4bb85-7096-462b-8474-85bd4d407377\") " Apr 23 14:28:01.963992 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.963966 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d4bb85-7096-462b-8474-85bd4d407377-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "25d4bb85-7096-462b-8474-85bd4d407377" (UID: "25d4bb85-7096-462b-8474-85bd4d407377"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:28:01.964065 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.963992 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d4bb85-7096-462b-8474-85bd4d407377-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "25d4bb85-7096-462b-8474-85bd4d407377" (UID: "25d4bb85-7096-462b-8474-85bd4d407377"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:28:01.965829 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.965807 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d4bb85-7096-462b-8474-85bd4d407377-kube-api-access-5tchj" (OuterVolumeSpecName: "kube-api-access-5tchj") pod "25d4bb85-7096-462b-8474-85bd4d407377" (UID: "25d4bb85-7096-462b-8474-85bd4d407377"). InnerVolumeSpecName "kube-api-access-5tchj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:28:01.965882 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:01.965812 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "25d4bb85-7096-462b-8474-85bd4d407377" (UID: "25d4bb85-7096-462b-8474-85bd4d407377"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:28:02.064561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.064459 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tchj\" (UniqueName: \"kubernetes.io/projected/25d4bb85-7096-462b-8474-85bd4d407377-kube-api-access-5tchj\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:28:02.064561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.064503 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d4bb85-7096-462b-8474-85bd4d407377-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:28:02.064561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.064514 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d4bb85-7096-462b-8474-85bd4d407377-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:28:02.064561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.064525 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d4bb85-7096-462b-8474-85bd4d407377-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:28:02.569491 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.569453 2582 generic.go:358] "Generic (PLEG): container finished" podID="2654c476-4740-479e-bc60-fee229645bbb" containerID="44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011" exitCode=0 Apr 23 14:28:02.569674 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.569497 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerDied","Data":"44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011"} Apr 23 14:28:02.571296 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.571274 2582 generic.go:358] "Generic (PLEG): container finished" podID="25d4bb85-7096-462b-8474-85bd4d407377" containerID="bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c" exitCode=0 Apr 23 14:28:02.571416 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.571332 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerDied","Data":"bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c"} Apr 23 14:28:02.571416 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.571354 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" Apr 23 14:28:02.571416 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.571366 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h" event={"ID":"25d4bb85-7096-462b-8474-85bd4d407377","Type":"ContainerDied","Data":"26fe80d7678c73e60a7cd05af41e9599574b5954ea3842c6b7b69e092f6f1152"} Apr 23 14:28:02.571416 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.571382 2582 scope.go:117] "RemoveContainer" containerID="8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04" Apr 23 14:28:02.579790 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.579770 2582 scope.go:117] "RemoveContainer" containerID="bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c" Apr 23 14:28:02.586999 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.586972 2582 scope.go:117] "RemoveContainer" containerID="6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122" Apr 23 14:28:02.594946 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.594928 2582 scope.go:117] "RemoveContainer" containerID="8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04" Apr 23 14:28:02.595222 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:28:02.595206 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04\": container with ID starting with 8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04 not found: ID does not exist" containerID="8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04" Apr 23 14:28:02.595270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.595233 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04"} err="failed to get container status \"8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04\": rpc error: code = NotFound desc = could not find container \"8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04\": container with ID starting with 8cf129dd56efb400d8bb938cb16821170aa73e4ea0c539f459bb8cdb2a2a1f04 not found: ID does not exist" Apr 23 14:28:02.595270 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.595250 2582 scope.go:117] "RemoveContainer" containerID="bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c" Apr 23 14:28:02.595489 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:28:02.595469 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c\": container with ID starting with bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c not found: ID does not exist" containerID="bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c" Apr 23 14:28:02.595531 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.595495 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c"} err="failed to get container status \"bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c\": rpc error: code = NotFound desc = could not find container \"bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c\": container with ID starting with 
bc84b6b973fe14d110917e8ec62c41e5b1e002e9fdc7303b6e6595493145661c not found: ID does not exist" Apr 23 14:28:02.595531 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.595510 2582 scope.go:117] "RemoveContainer" containerID="6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122" Apr 23 14:28:02.595738 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:28:02.595721 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122\": container with ID starting with 6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122 not found: ID does not exist" containerID="6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122" Apr 23 14:28:02.595779 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.595744 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122"} err="failed to get container status \"6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122\": rpc error: code = NotFound desc = could not find container \"6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122\": container with ID starting with 6e3a89df9f813ca42278bfb2381b4a89a02bb913221a75645eeb390127683122 not found: ID does not exist" Apr 23 14:28:02.603184 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.603161 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h"] Apr 23 14:28:02.607569 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:02.607550 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-jqv9h"] Apr 23 14:28:03.576301 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:03.576255 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerStarted","Data":"fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d"} Apr 23 14:28:03.576301 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:03.576306 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerStarted","Data":"e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49"} Apr 23 14:28:03.576826 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:03.576661 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:28:03.576826 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:03.576693 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:28:03.597048 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:03.597002 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podStartSLOduration=5.596986149 podStartE2EDuration="5.596986149s" podCreationTimestamp="2026-04-23 14:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:28:03.595878319 +0000 UTC m=+3386.583119199" 
watchObservedRunningTime="2026-04-23 14:28:03.596986149 +0000 UTC m=+3386.584227025" Apr 23 14:28:03.670599 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:03.670562 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d4bb85-7096-462b-8474-85bd4d407377" path="/var/lib/kubelet/pods/25d4bb85-7096-462b-8474-85bd4d407377/volumes" Apr 23 14:28:09.588999 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:09.588964 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:28:39.633104 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:39.633057 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 14:28:49.591408 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:49.591380 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:28:58.419640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.419606 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj"] Apr 23 14:28:58.420203 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.420044 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kserve-container" containerID="cri-o://e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49" gracePeriod=30 Apr 23 14:28:58.420203 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.420068 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kube-rbac-proxy" containerID="cri-o://fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d" gracePeriod=30 Apr 23 14:28:58.516732 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.516694 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8"] Apr 23 14:28:58.517088 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517074 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" Apr 23 14:28:58.517138 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517092 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" Apr 23 14:28:58.517138 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517119 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="storage-initializer" Apr 23 14:28:58.517138 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517129 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="storage-initializer" Apr 23 14:28:58.517235 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517141 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kube-rbac-proxy" 
Apr 23 14:28:58.517235 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517151 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kube-rbac-proxy" Apr 23 14:28:58.517235 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517224 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kserve-container" Apr 23 14:28:58.517327 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.517236 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="25d4bb85-7096-462b-8474-85bd4d407377" containerName="kube-rbac-proxy" Apr 23 14:28:58.520491 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.520474 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.522811 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.522792 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 23 14:28:58.522982 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.522837 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 23 14:28:58.528667 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.528638 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8"] Apr 23 14:28:58.598615 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.598572 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.598774 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.598628 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df87f19d-c6db-44c6-b083-f9633efc191e-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.598774 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.598677 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df87f19d-c6db-44c6-b083-f9633efc191e-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.598774 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.598715 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhn7q\" (UniqueName: \"kubernetes.io/projected/df87f19d-c6db-44c6-b083-f9633efc191e-kube-api-access-zhn7q\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.699424 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.699331 
2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df87f19d-c6db-44c6-b083-f9633efc191e-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.699424 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.699377 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhn7q\" (UniqueName: \"kubernetes.io/projected/df87f19d-c6db-44c6-b083-f9633efc191e-kube-api-access-zhn7q\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.699672 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.699433 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.699672 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.699449 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df87f19d-c6db-44c6-b083-f9633efc191e-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.699672 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:28:58.699593 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 23 14:28:58.699799 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:28:58.699704 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls podName:df87f19d-c6db-44c6-b083-f9633efc191e nodeName:}" failed. No retries permitted until 2026-04-23 14:28:59.199682731 +0000 UTC m=+3442.186923591 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" (UID: "df87f19d-c6db-44c6-b083-f9633efc191e") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 23 14:28:58.699860 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.699844 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df87f19d-c6db-44c6-b083-f9633efc191e-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.700098 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.700081 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df87f19d-c6db-44c6-b083-f9633efc191e-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.710004 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.709980 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhn7q\" (UniqueName: \"kubernetes.io/projected/df87f19d-c6db-44c6-b083-f9633efc191e-kube-api-access-zhn7q\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:58.742505 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.742468 2582 generic.go:358] "Generic (PLEG): container finished" podID="2654c476-4740-479e-bc60-fee229645bbb" containerID="fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d" exitCode=2 Apr 23 14:28:58.742664 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:58.742543 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerDied","Data":"fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d"} Apr 23 14:28:59.202230 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.202195 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:59.204742 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.204717 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:59.432408 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.432360 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:28:59.559392 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.559357 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8"] Apr 23 14:28:59.559979 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:28:59.559949 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf87f19d_c6db_44c6_b083_f9633efc191e.slice/crio-85c8557455c8abeb98ce946a836a938024e2f2a9d62606cea1ae7693fa8c9cd6 WatchSource:0}: Error finding container 85c8557455c8abeb98ce946a836a938024e2f2a9d62606cea1ae7693fa8c9cd6: Status 404 returned error can't find the container with id 85c8557455c8abeb98ce946a836a938024e2f2a9d62606cea1ae7693fa8c9cd6 Apr 23 14:28:59.583774 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.583736 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused" Apr 23 14:28:59.746461 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.746370 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerStarted","Data":"7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2"} Apr 23 14:28:59.746461 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:28:59.746412 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerStarted","Data":"85c8557455c8abeb98ce946a836a938024e2f2a9d62606cea1ae7693fa8c9cd6"} Apr 23 14:29:00.631117 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:00.631074 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.48:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 23 14:29:03.758743 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:03.758704 2582 generic.go:358] "Generic (PLEG): container finished" podID="df87f19d-c6db-44c6-b083-f9633efc191e" containerID="7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2" exitCode=0 Apr 23 14:29:03.759179 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:03.758778 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerDied","Data":"7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2"} Apr 23 14:29:04.583537 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:04.583489 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused" Apr 23 14:29:04.764079 ip-10-0-139-40 
kubenswrapper[2582]: I0423 14:29:04.764044 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerStarted","Data":"01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e"} Apr 23 14:29:04.764079 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:04.764081 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerStarted","Data":"b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f"} Apr 23 14:29:04.764557 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:04.764411 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:29:04.764557 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:04.764529 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:29:04.765902 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:04.765876 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:29:04.783786 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:04.783738 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podStartSLOduration=6.783724262 podStartE2EDuration="6.783724262s" podCreationTimestamp="2026-04-23 14:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:29:04.783045813 +0000 UTC m=+3447.770286691" watchObservedRunningTime="2026-04-23 14:29:04.783724262 +0000 UTC m=+3447.770965143" Apr 23 14:29:05.768372 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:05.768327 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:29:06.094703 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.094678 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:29:06.158700 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.158660 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h466\" (UniqueName: \"kubernetes.io/projected/2654c476-4740-479e-bc60-fee229645bbb-kube-api-access-5h466\") pod \"2654c476-4740-479e-bc60-fee229645bbb\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " Apr 23 14:29:06.158879 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.158711 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2654c476-4740-479e-bc60-fee229645bbb-proxy-tls\") pod \"2654c476-4740-479e-bc60-fee229645bbb\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " Apr 23 14:29:06.158879 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.158755 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2654c476-4740-479e-bc60-fee229645bbb-kserve-provision-location\") pod \"2654c476-4740-479e-bc60-fee229645bbb\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " Apr 23 14:29:06.158879 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.158786 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2654c476-4740-479e-bc60-fee229645bbb-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"2654c476-4740-479e-bc60-fee229645bbb\" (UID: \"2654c476-4740-479e-bc60-fee229645bbb\") " Apr 23 14:29:06.159168 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.159141 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2654c476-4740-479e-bc60-fee229645bbb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2654c476-4740-479e-bc60-fee229645bbb" (UID: "2654c476-4740-479e-bc60-fee229645bbb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:29:06.159238 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.159189 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2654c476-4740-479e-bc60-fee229645bbb-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "2654c476-4740-479e-bc60-fee229645bbb" (UID: "2654c476-4740-479e-bc60-fee229645bbb"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:29:06.161024 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.161001 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2654c476-4740-479e-bc60-fee229645bbb-kube-api-access-5h466" (OuterVolumeSpecName: "kube-api-access-5h466") pod "2654c476-4740-479e-bc60-fee229645bbb" (UID: "2654c476-4740-479e-bc60-fee229645bbb"). InnerVolumeSpecName "kube-api-access-5h466". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:29:06.161111 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.161094 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2654c476-4740-479e-bc60-fee229645bbb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2654c476-4740-479e-bc60-fee229645bbb" (UID: "2654c476-4740-479e-bc60-fee229645bbb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:29:06.259307 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.259270 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2654c476-4740-479e-bc60-fee229645bbb-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:29:06.259307 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.259301 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2654c476-4740-479e-bc60-fee229645bbb-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:29:06.259307 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.259312 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5h466\" (UniqueName: \"kubernetes.io/projected/2654c476-4740-479e-bc60-fee229645bbb-kube-api-access-5h466\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:29:06.259533 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.259321 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2654c476-4740-479e-bc60-fee229645bbb-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:29:06.772454 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.772410 2582 generic.go:358] "Generic (PLEG): container finished" podID="2654c476-4740-479e-bc60-fee229645bbb" containerID="e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49" exitCode=0 Apr 23 14:29:06.772900 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.772484 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerDied","Data":"e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49"} Apr 23 14:29:06.772900 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.772518 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" event={"ID":"2654c476-4740-479e-bc60-fee229645bbb","Type":"ContainerDied","Data":"299c11fecad453a883dd030cb40b091646f5660d0995341fd99625c72c2e5dd6"} Apr 23 14:29:06.772900 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.772533 2582 scope.go:117] "RemoveContainer" containerID="fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d" Apr 23 14:29:06.772900 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.772494 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj" Apr 23 14:29:06.773096 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.772986 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:29:06.780907 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.780878 2582 scope.go:117] "RemoveContainer" containerID="e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49" Apr 23 14:29:06.788034 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.788016 2582 scope.go:117] "RemoveContainer" containerID="44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011" Apr 23 14:29:06.794704 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.794679 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj"] Apr 23 14:29:06.795470 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.795452 2582 scope.go:117] "RemoveContainer" containerID="fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d" Apr 23 14:29:06.795763 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:29:06.795738 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d\": container with ID starting with fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d not found: ID does not exist" containerID="fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d" Apr 23 14:29:06.795823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.795775 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d"} err="failed to get container status \"fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d\": rpc error: code = NotFound desc = could not find container \"fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d\": container with ID starting with fad39fba3176867eaec41964d885ad37afd7992890334306986eb23b38e1044d not found: ID does not exist" Apr 23 14:29:06.795823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.795795 2582 scope.go:117] "RemoveContainer" containerID="e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49" Apr 23 14:29:06.796098 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:29:06.796078 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49\": container with ID starting with e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49 not found: ID does not exist" containerID="e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49" Apr 23 14:29:06.796164 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.796107 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49"} err="failed to get container status \"e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49\": rpc error: code = NotFound desc = could not find container \"e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49\": container with ID 
starting with e2580c3ef3dff908541584573bb8d2504e74d7e034b4148c861b389f06785a49 not found: ID does not exist" Apr 23 14:29:06.796164 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.796124 2582 scope.go:117] "RemoveContainer" containerID="44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011" Apr 23 14:29:06.796390 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:29:06.796375 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011\": container with ID starting with 44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011 not found: ID does not exist" containerID="44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011" Apr 23 14:29:06.796431 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.796396 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011"} err="failed to get container status \"44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011\": rpc error: code = NotFound desc = could not find container \"44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011\": container with ID starting with 44477fb9c3f42f9f2b19ccfcc657cee5a29cd8798ad9e460beec35096f2c5011 not found: ID does not exist" Apr 23 14:29:06.799501 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:06.799482 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-vlzsj"] Apr 23 14:29:07.669767 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:07.669732 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2654c476-4740-479e-bc60-fee229645bbb" path="/var/lib/kubelet/pods/2654c476-4740-479e-bc60-fee229645bbb/volumes" Apr 23 14:29:11.777079 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:11.777051 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:29:11.777632 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:11.777607 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:29:21.777623 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:21.777570 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:29:31.777700 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:31.777661 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:29:41.778307 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:41.778259 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.49:8080: connect: connection refused" Apr 23 14:29:51.778000 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:29:51.777890 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:30:01.778394 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:01.778352 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:30:11.778509 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:11.778478 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:30:18.615657 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:18.615615 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8"] Apr 23 14:30:18.616091 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:18.615942 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" containerID="cri-o://b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f" gracePeriod=30 Apr 23 14:30:18.616091 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:18.615992 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kube-rbac-proxy" containerID="cri-o://01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e" gracePeriod=30 Apr 23 14:30:18.967260 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:18.967225 2582 generic.go:358] "Generic (PLEG): container finished" podID="df87f19d-c6db-44c6-b083-f9633efc191e" containerID="01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e" exitCode=2 Apr 23 14:30:18.967433 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:18.967302 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerDied","Data":"01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e"} Apr 23 14:30:21.773242 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:21.773191 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 23 14:30:21.778567 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:21.778537 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 14:30:22.455744 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.455720 2582 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:30:22.545665 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.545569 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls\") pod \"df87f19d-c6db-44c6-b083-f9633efc191e\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " Apr 23 14:30:22.545665 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.545623 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df87f19d-c6db-44c6-b083-f9633efc191e-kserve-provision-location\") pod \"df87f19d-c6db-44c6-b083-f9633efc191e\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " Apr 23 14:30:22.545665 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.545664 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhn7q\" (UniqueName: \"kubernetes.io/projected/df87f19d-c6db-44c6-b083-f9633efc191e-kube-api-access-zhn7q\") pod \"df87f19d-c6db-44c6-b083-f9633efc191e\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " Apr 23 14:30:22.545947 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.545695 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df87f19d-c6db-44c6-b083-f9633efc191e-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"df87f19d-c6db-44c6-b083-f9633efc191e\" (UID: \"df87f19d-c6db-44c6-b083-f9633efc191e\") " Apr 23 14:30:22.546073 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.546030 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df87f19d-c6db-44c6-b083-f9633efc191e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df87f19d-c6db-44c6-b083-f9633efc191e" (UID: "df87f19d-c6db-44c6-b083-f9633efc191e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:30:22.546196 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.546092 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df87f19d-c6db-44c6-b083-f9633efc191e-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "df87f19d-c6db-44c6-b083-f9633efc191e" (UID: "df87f19d-c6db-44c6-b083-f9633efc191e"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:30:22.547858 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.547840 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df87f19d-c6db-44c6-b083-f9633efc191e" (UID: "df87f19d-c6db-44c6-b083-f9633efc191e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:30:22.547955 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.547886 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df87f19d-c6db-44c6-b083-f9633efc191e-kube-api-access-zhn7q" (OuterVolumeSpecName: "kube-api-access-zhn7q") pod "df87f19d-c6db-44c6-b083-f9633efc191e" (UID: "df87f19d-c6db-44c6-b083-f9633efc191e"). 
InnerVolumeSpecName "kube-api-access-zhn7q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:30:22.646763 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.646703 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhn7q\" (UniqueName: \"kubernetes.io/projected/df87f19d-c6db-44c6-b083-f9633efc191e-kube-api-access-zhn7q\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:30:22.646763 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.646751 2582 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df87f19d-c6db-44c6-b083-f9633efc191e-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:30:22.646763 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.646766 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df87f19d-c6db-44c6-b083-f9633efc191e-proxy-tls\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:30:22.646763 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.646777 2582 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df87f19d-c6db-44c6-b083-f9633efc191e-kserve-provision-location\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:30:22.979316 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.979279 2582 generic.go:358] "Generic (PLEG): container finished" podID="df87f19d-c6db-44c6-b083-f9633efc191e" containerID="b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f" exitCode=0 Apr 23 14:30:22.979728 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.979321 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerDied","Data":"b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f"} Apr 23 14:30:22.979728 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.979348 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" event={"ID":"df87f19d-c6db-44c6-b083-f9633efc191e","Type":"ContainerDied","Data":"85c8557455c8abeb98ce946a836a938024e2f2a9d62606cea1ae7693fa8c9cd6"} Apr 23 14:30:22.979728 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.979359 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8" Apr 23 14:30:22.979728 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.979365 2582 scope.go:117] "RemoveContainer" containerID="01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e" Apr 23 14:30:22.987654 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.987631 2582 scope.go:117] "RemoveContainer" containerID="b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f" Apr 23 14:30:22.996796 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:22.996776 2582 scope.go:117] "RemoveContainer" containerID="7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2" Apr 23 14:30:23.000868 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.000846 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8"] Apr 23 14:30:23.004710 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.004689 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-qc8j8"] Apr 23 14:30:23.004784 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.004731 2582 scope.go:117] "RemoveContainer" containerID="01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e" Apr 23 14:30:23.005113 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:30:23.005083 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e\": container with ID starting with 01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e not found: ID does not exist" containerID="01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e" Apr 23 14:30:23.005188 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.005122 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e"} err="failed to get container status \"01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e\": rpc error: code = NotFound desc = could not find container \"01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e\": container with ID starting with 01fd225c1bed9943a20c1047e5d27203307b25c1fe3f6ea9c39d3d01d72d145e not found: ID does not exist" Apr 23 14:30:23.005188 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.005151 2582 scope.go:117] "RemoveContainer" containerID="b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f" Apr 23 14:30:23.005377 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:30:23.005363 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f\": container with ID starting with b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f not found: ID does not exist" containerID="b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f" Apr 23 14:30:23.005423 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.005380 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f"} err="failed to get container status \"b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f\": rpc error: code = NotFound desc = could not find container \"b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f\": 
container with ID starting with b0e7ae5bbf03380de1b082e038c2d6692454628b7c14fa493f17378fd4bbdf8f not found: ID does not exist" Apr 23 14:30:23.005423 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.005392 2582 scope.go:117] "RemoveContainer" containerID="7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2" Apr 23 14:30:23.005619 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:30:23.005602 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2\": container with ID starting with 7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2 not found: ID does not exist" containerID="7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2" Apr 23 14:30:23.005662 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.005625 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2"} err="failed to get container status \"7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2\": rpc error: code = NotFound desc = could not find container \"7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2\": container with ID starting with 7db3fe1e8120873fbf4b11407299cbaa2df76520c00d4a0614f59b12731971e2 not found: ID does not exist" Apr 23 14:30:23.669250 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:30:23.669219 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" path="/var/lib/kubelet/pods/df87f19d-c6db-44c6-b083-f9633efc191e/volumes" Apr 23 14:32:44.458461 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:32:44.458431 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:32:44.459747 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:32:44.459725 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:36:16.523562 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523480 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-spgvs/must-gather-ml7lk"] Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523773 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523787 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523800 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="storage-initializer" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523806 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="storage-initializer" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523815 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kube-rbac-proxy" Apr 23 14:36:16.524081 ip-10-0-139-40 
kubenswrapper[2582]: I0423 14:36:16.523821 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kube-rbac-proxy" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523829 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kube-rbac-proxy" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523834 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kube-rbac-proxy" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523842 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kserve-container" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523846 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kserve-container" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523856 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="storage-initializer" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523861 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="storage-initializer" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523901 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kserve-container" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523907 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kserve-container" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523933 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="2654c476-4740-479e-bc60-fee229645bbb" containerName="kube-rbac-proxy" Apr 23 14:36:16.524081 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.523945 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="df87f19d-c6db-44c6-b083-f9633efc191e" containerName="kube-rbac-proxy" Apr 23 14:36:16.526819 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.526796 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.529388 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.529365 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-spgvs\"/\"default-dockercfg-m8l8f\"" Apr 23 14:36:16.529488 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.529415 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-spgvs\"/\"openshift-service-ca.crt\"" Apr 23 14:36:16.529488 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.529417 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-spgvs\"/\"kube-root-ca.crt\"" Apr 23 14:36:16.537663 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.537639 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-spgvs/must-gather-ml7lk"] Apr 23 14:36:16.701383 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.701350 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/484d21a2-8679-4e93-aa4e-805b0e529ef6-must-gather-output\") pod \"must-gather-ml7lk\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.701561 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.701389 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfqp2\" (UniqueName: \"kubernetes.io/projected/484d21a2-8679-4e93-aa4e-805b0e529ef6-kube-api-access-sfqp2\") pod \"must-gather-ml7lk\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.802755 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.802673 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/484d21a2-8679-4e93-aa4e-805b0e529ef6-must-gather-output\") pod \"must-gather-ml7lk\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.802755 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.802724 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfqp2\" (UniqueName: \"kubernetes.io/projected/484d21a2-8679-4e93-aa4e-805b0e529ef6-kube-api-access-sfqp2\") pod \"must-gather-ml7lk\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.803061 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.803042 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/484d21a2-8679-4e93-aa4e-805b0e529ef6-must-gather-output\") pod \"must-gather-ml7lk\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.811886 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.811867 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfqp2\" (UniqueName: \"kubernetes.io/projected/484d21a2-8679-4e93-aa4e-805b0e529ef6-kube-api-access-sfqp2\") pod \"must-gather-ml7lk\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.835351 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.835318 2582 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:16.992653 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.992550 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-spgvs/must-gather-ml7lk"] Apr 23 14:36:16.995346 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:36:16.995322 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod484d21a2_8679_4e93_aa4e_805b0e529ef6.slice/crio-c0ff90bc1d166cafbfbec2ee9451df249f1f4faf75e4a574f971514564c986d6 WatchSource:0}: Error finding container c0ff90bc1d166cafbfbec2ee9451df249f1f4faf75e4a574f971514564c986d6: Status 404 returned error can't find the container with id c0ff90bc1d166cafbfbec2ee9451df249f1f4faf75e4a574f971514564c986d6 Apr 23 14:36:16.996935 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:16.996903 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:36:17.902250 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:17.902213 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-spgvs/must-gather-ml7lk" event={"ID":"484d21a2-8679-4e93-aa4e-805b0e529ef6","Type":"ContainerStarted","Data":"c0ff90bc1d166cafbfbec2ee9451df249f1f4faf75e4a574f971514564c986d6"} Apr 23 14:36:21.917251 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:21.917204 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-spgvs/must-gather-ml7lk" event={"ID":"484d21a2-8679-4e93-aa4e-805b0e529ef6","Type":"ContainerStarted","Data":"54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3"} Apr 23 14:36:21.917251 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:21.917242 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-spgvs/must-gather-ml7lk" event={"ID":"484d21a2-8679-4e93-aa4e-805b0e529ef6","Type":"ContainerStarted","Data":"d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c"} Apr 23 14:36:21.934462 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:21.934401 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-spgvs/must-gather-ml7lk" podStartSLOduration=1.355524891 podStartE2EDuration="5.934384126s" podCreationTimestamp="2026-04-23 14:36:16 +0000 UTC" firstStartedPulling="2026-04-23 14:36:16.99711166 +0000 UTC m=+3879.984352533" lastFinishedPulling="2026-04-23 14:36:21.575970911 +0000 UTC m=+3884.563211768" observedRunningTime="2026-04-23 14:36:21.932478168 +0000 UTC m=+3884.919719061" watchObservedRunningTime="2026-04-23 14:36:21.934384126 +0000 UTC m=+3884.921625007" Apr 23 14:36:43.993301 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:43.993264 2582 generic.go:358] "Generic (PLEG): container finished" podID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerID="d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c" exitCode=0 Apr 23 14:36:43.993739 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:43.993299 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-spgvs/must-gather-ml7lk" event={"ID":"484d21a2-8679-4e93-aa4e-805b0e529ef6","Type":"ContainerDied","Data":"d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c"} Apr 23 14:36:43.993739 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:43.993622 2582 scope.go:117] "RemoveContainer" containerID="d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c" Apr 23 14:36:44.499714 ip-10-0-139-40 kubenswrapper[2582]: 
I0423 14:36:44.499669 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-spgvs_must-gather-ml7lk_484d21a2-8679-4e93-aa4e-805b0e529ef6/gather/0.log" Apr 23 14:36:47.907355 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:47.907318 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-msdj4_97134a37-e40a-4587-b02d-795b8a714cc0/global-pull-secret-syncer/0.log" Apr 23 14:36:48.099778 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:48.099746 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r2klv_66ef4cab-5277-4b3a-a87a-8cc03965a437/konnectivity-agent/0.log" Apr 23 14:36:48.143764 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:48.143728 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-40.ec2.internal_561727a35e04946faef12be860e97824/haproxy/0.log" Apr 23 14:36:49.971582 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:49.971542 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-spgvs/must-gather-ml7lk"] Apr 23 14:36:49.972094 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:49.971769 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-spgvs/must-gather-ml7lk" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="copy" containerID="cri-o://54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3" gracePeriod=2 Apr 23 14:36:49.976212 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:49.975599 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-spgvs/must-gather-ml7lk"] Apr 23 14:36:50.204107 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.204076 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-spgvs_must-gather-ml7lk_484d21a2-8679-4e93-aa4e-805b0e529ef6/copy/0.log" Apr 23 14:36:50.204508 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.204490 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:50.207154 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.207126 2582 status_manager.go:895] "Failed to get status for pod" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" pod="openshift-must-gather-spgvs/must-gather-ml7lk" err="pods \"must-gather-ml7lk\" is forbidden: User \"system:node:ip-10-0-139-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-spgvs\": no relationship found between node 'ip-10-0-139-40.ec2.internal' and this object" Apr 23 14:36:50.271670 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.271587 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/484d21a2-8679-4e93-aa4e-805b0e529ef6-must-gather-output\") pod \"484d21a2-8679-4e93-aa4e-805b0e529ef6\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " Apr 23 14:36:50.271670 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.271663 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfqp2\" (UniqueName: \"kubernetes.io/projected/484d21a2-8679-4e93-aa4e-805b0e529ef6-kube-api-access-sfqp2\") pod \"484d21a2-8679-4e93-aa4e-805b0e529ef6\" (UID: \"484d21a2-8679-4e93-aa4e-805b0e529ef6\") " Apr 23 14:36:50.273155 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.273130 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484d21a2-8679-4e93-aa4e-805b0e529ef6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "484d21a2-8679-4e93-aa4e-805b0e529ef6" (UID: "484d21a2-8679-4e93-aa4e-805b0e529ef6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:36:50.274023 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.273995 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484d21a2-8679-4e93-aa4e-805b0e529ef6-kube-api-access-sfqp2" (OuterVolumeSpecName: "kube-api-access-sfqp2") pod "484d21a2-8679-4e93-aa4e-805b0e529ef6" (UID: "484d21a2-8679-4e93-aa4e-805b0e529ef6"). InnerVolumeSpecName "kube-api-access-sfqp2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:36:50.372543 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.372512 2582 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/484d21a2-8679-4e93-aa4e-805b0e529ef6-must-gather-output\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:36:50.372543 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:50.372540 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfqp2\" (UniqueName: \"kubernetes.io/projected/484d21a2-8679-4e93-aa4e-805b0e529ef6-kube-api-access-sfqp2\") on node \"ip-10-0-139-40.ec2.internal\" DevicePath \"\"" Apr 23 14:36:51.013515 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.013480 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-spgvs_must-gather-ml7lk_484d21a2-8679-4e93-aa4e-805b0e529ef6/copy/0.log" Apr 23 14:36:51.013968 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.013853 2582 generic.go:358] "Generic (PLEG): container finished" podID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerID="54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3" exitCode=143 Apr 23 14:36:51.013968 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.013908 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-spgvs/must-gather-ml7lk" Apr 23 14:36:51.013968 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.013954 2582 scope.go:117] "RemoveContainer" containerID="54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3" Apr 23 14:36:51.016563 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.016539 2582 status_manager.go:895] "Failed to get status for pod" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" pod="openshift-must-gather-spgvs/must-gather-ml7lk" err="pods \"must-gather-ml7lk\" is forbidden: User \"system:node:ip-10-0-139-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-spgvs\": no relationship found between node 'ip-10-0-139-40.ec2.internal' and this object" Apr 23 14:36:51.021624 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.021607 2582 scope.go:117] "RemoveContainer" containerID="d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c" Apr 23 14:36:51.024163 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.024133 2582 status_manager.go:895] "Failed to get status for pod" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" pod="openshift-must-gather-spgvs/must-gather-ml7lk" err="pods \"must-gather-ml7lk\" is forbidden: User \"system:node:ip-10-0-139-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-spgvs\": no relationship found between node 'ip-10-0-139-40.ec2.internal' and this object" Apr 23 14:36:51.033512 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.033492 2582 scope.go:117] "RemoveContainer" containerID="54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3" Apr 23 14:36:51.033742 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:36:51.033722 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3\": container with ID starting with 54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3 not found: ID does not exist" containerID="54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3" Apr 23 14:36:51.033788 
ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.033751 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3"} err="failed to get container status \"54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3\": rpc error: code = NotFound desc = could not find container \"54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3\": container with ID starting with 54f5b31b3ad2d5bde1fe16d4067af3396a422b4e09fdcf683d099d34e9ff5af3 not found: ID does not exist" Apr 23 14:36:51.033788 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.033771 2582 scope.go:117] "RemoveContainer" containerID="d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c" Apr 23 14:36:51.034038 ip-10-0-139-40 kubenswrapper[2582]: E0423 14:36:51.034016 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c\": container with ID starting with d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c not found: ID does not exist" containerID="d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c" Apr 23 14:36:51.034106 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.034047 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c"} err="failed to get container status \"d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c\": rpc error: code = NotFound desc = could not find container \"d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c\": container with ID starting with d28fa5cc924f72e80917450ff9b5bb896e2ba71e556ead62ac1e332dfd4f3c0c not found: ID does not exist" Apr 23 14:36:51.670434 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.670399 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" path="/var/lib/kubelet/pods/484d21a2-8679-4e93-aa4e-805b0e529ef6/volumes" Apr 23 14:36:51.965089 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.965005 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-nvt2v_f483be1c-073c-402c-b195-f1bfc3325dea/monitoring-plugin/0.log" Apr 23 14:36:51.997878 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:51.997851 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-llhfg_9476c852-09eb-4fdf-9fbc-50fc81e92780/node-exporter/0.log" Apr 23 14:36:52.021493 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:52.021466 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-llhfg_9476c852-09eb-4fdf-9fbc-50fc81e92780/kube-rbac-proxy/0.log" Apr 23 14:36:52.040949 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:52.040910 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-llhfg_9476c852-09eb-4fdf-9fbc-50fc81e92780/init-textfile/0.log" Apr 23 14:36:52.501158 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:52.501130 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-zt747_858931d3-ce64-4e42-adf5-a28423e3abd3/prometheus-operator-admission-webhook/0.log" Apr 23 14:36:54.709939 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.709895 2582 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console_console-8c7b6b7b6-7gbmt_3e36b65e-d6eb-4549-a16c-34dfa89283d2/console/0.log" Apr 23 14:36:54.750753 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.750724 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-dsbgl_bd3f2a9d-bf00-49e1-a88d-4a9fad71b3a4/download-server/0.log" Apr 23 14:36:54.951714 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.951682 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69"] Apr 23 14:36:54.951990 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.951978 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="gather" Apr 23 14:36:54.952056 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.951992 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="gather" Apr 23 14:36:54.952056 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.952009 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="copy" Apr 23 14:36:54.952056 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.952014 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="copy" Apr 23 14:36:54.952164 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.952057 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="gather" Apr 23 14:36:54.952164 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.952066 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="484d21a2-8679-4e93-aa4e-805b0e529ef6" containerName="copy" Apr 23 14:36:54.956750 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.956730 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:54.964059 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.964036 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-66tjc\"/\"openshift-service-ca.crt\"" Apr 23 14:36:54.964227 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.964038 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-66tjc\"/\"default-dockercfg-6hdlv\"" Apr 23 14:36:54.964312 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.964257 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-66tjc\"/\"kube-root-ca.crt\"" Apr 23 14:36:54.965031 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:54.965007 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69"] Apr 23 14:36:55.002444 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.002421 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-podres\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.002575 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.002452 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-proc\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.002575 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.002483 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mb2v\" (UniqueName: \"kubernetes.io/projected/43a21114-2129-4d10-bcce-0b6111a92e28-kube-api-access-7mb2v\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.002575 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.002545 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-lib-modules\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.002676 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.002589 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-sys\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103461 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103422 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-lib-modules\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " 
pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103479 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-sys\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103519 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-podres\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103542 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-proc\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103600 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-lib-modules\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103640 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mb2v\" (UniqueName: \"kubernetes.io/projected/43a21114-2129-4d10-bcce-0b6111a92e28-kube-api-access-7mb2v\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103658 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-podres\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103665 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-proc\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.103823 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.103629 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43a21114-2129-4d10-bcce-0b6111a92e28-sys\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.113467 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.113442 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7mb2v\" (UniqueName: \"kubernetes.io/projected/43a21114-2129-4d10-bcce-0b6111a92e28-kube-api-access-7mb2v\") pod \"perf-node-gather-daemonset-pmv69\" (UID: \"43a21114-2129-4d10-bcce-0b6111a92e28\") " pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.267535 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.267436 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:55.404330 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.404295 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69"] Apr 23 14:36:55.407233 ip-10-0-139-40 kubenswrapper[2582]: W0423 14:36:55.407203 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod43a21114_2129_4d10_bcce_0b6111a92e28.slice/crio-a41356e5755c6eb3d898885af73dc80e29e73af7fc60aebcb4a78ea6ce293941 WatchSource:0}: Error finding container a41356e5755c6eb3d898885af73dc80e29e73af7fc60aebcb4a78ea6ce293941: Status 404 returned error can't find the container with id a41356e5755c6eb3d898885af73dc80e29e73af7fc60aebcb4a78ea6ce293941 Apr 23 14:36:55.906132 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.906103 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8jlgh_3ad71ca4-3e9d-4f24-866e-d3a822733344/dns/0.log" Apr 23 14:36:55.924612 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:55.924588 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8jlgh_3ad71ca4-3e9d-4f24-866e-d3a822733344/kube-rbac-proxy/0.log" Apr 23 14:36:56.028657 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:56.028623 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" event={"ID":"43a21114-2129-4d10-bcce-0b6111a92e28","Type":"ContainerStarted","Data":"5d3ab18d5a83814a83e54740e9c210a8249290326986a381af316bb690c97c93"} Apr 23 14:36:56.028657 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:56.028660 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" event={"ID":"43a21114-2129-4d10-bcce-0b6111a92e28","Type":"ContainerStarted","Data":"a41356e5755c6eb3d898885af73dc80e29e73af7fc60aebcb4a78ea6ce293941"} Apr 23 14:36:56.028874 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:56.028737 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:36:56.044013 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:56.043966 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" podStartSLOduration=2.043951604 podStartE2EDuration="2.043951604s" podCreationTimestamp="2026-04-23 14:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:36:56.043779481 +0000 UTC m=+3919.031020361" watchObservedRunningTime="2026-04-23 14:36:56.043951604 +0000 UTC m=+3919.031192482" Apr 23 14:36:56.064497 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:56.064472 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9tk4c_25990e8e-c5b7-435c-8980-d1c4bd84116d/dns-node-resolver/0.log" Apr 23 14:36:56.544632 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:36:56.544604 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kz76p_b96ebd94-9b73-4821-8946-4734e772932d/node-ca/0.log" Apr 23 14:36:57.623132 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:57.623099 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qx8mp_71016223-1429-40b3-94b8-0fa57c7f235a/serve-healthcheck-canary/0.log" Apr 23 14:36:58.104789 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:58.104757 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zp2vr_ec60ed1a-66b8-4ec4-ab1d-101ecce247a4/kube-rbac-proxy/0.log" Apr 23 14:36:58.125467 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:58.125407 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zp2vr_ec60ed1a-66b8-4ec4-ab1d-101ecce247a4/exporter/0.log" Apr 23 14:36:58.147790 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:36:58.147765 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zp2vr_ec60ed1a-66b8-4ec4-ab1d-101ecce247a4/extractor/0.log" Apr 23 14:37:00.776014 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:00.775988 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-hsvj8_1148e33c-d2ca-4cdd-b81a-604ba117dd37/seaweedfs/0.log" Apr 23 14:37:00.821507 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:00.821479 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-xhn2h_e40efd34-7dd8-4a14-985b-f10d4925bd6e/seaweedfs-tls-serving/0.log" Apr 23 14:37:02.040609 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:02.040581 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-66tjc/perf-node-gather-daemonset-pmv69" Apr 23 14:37:06.592138 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.592103 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/kube-multus-additional-cni-plugins/0.log" Apr 23 14:37:06.626828 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.626804 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/egress-router-binary-copy/0.log" Apr 23 14:37:06.718871 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.718841 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/cni-plugins/0.log" Apr 23 14:37:06.740153 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.740077 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/bond-cni-plugin/0.log" Apr 23 14:37:06.764582 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.764561 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/routeoverride-cni/0.log" Apr 23 14:37:06.786720 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.786703 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/whereabouts-cni-bincopy/0.log" Apr 23 14:37:06.808000 ip-10-0-139-40 kubenswrapper[2582]: I0423 
14:37:06.807982 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdtkd_431c9349-7f7f-4d46-8b03-2517188be63c/whereabouts-cni/0.log" Apr 23 14:37:06.894496 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.894470 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dm595_1f7b9e0a-9c75-402a-9f74-7dc83741af82/kube-multus/0.log" Apr 23 14:37:06.917339 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.917315 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8vwqm_bf879d65-39bb-4d9a-aa57-7d499026e167/network-metrics-daemon/0.log" Apr 23 14:37:06.937945 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:06.937912 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8vwqm_bf879d65-39bb-4d9a-aa57-7d499026e167/kube-rbac-proxy/0.log" Apr 23 14:37:08.115151 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.115100 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-controller/0.log" Apr 23 14:37:08.132713 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.132686 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/0.log" Apr 23 14:37:08.153198 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.153167 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovn-acl-logging/1.log" Apr 23 14:37:08.168670 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.168643 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/kube-rbac-proxy-node/0.log" Apr 23 14:37:08.187455 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.187419 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 14:37:08.208742 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.208716 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/northd/0.log" Apr 23 14:37:08.231022 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.230988 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/nbdb/0.log" Apr 23 14:37:08.254205 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.254184 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/sbdb/0.log" Apr 23 14:37:08.370325 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:08.370245 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cx2lr_30d85b9d-16ae-419a-8534-8b142607909e/ovnkube-controller/0.log" Apr 23 14:37:09.656683 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:09.656654 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dnqkh_70f0fbee-2214-4d11-8550-54879ecb58b1/network-check-target-container/0.log" Apr 23 14:37:10.623473 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:10.623439 2582 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rgtnc_1a0b8e16-337d-4350-8a10-754ae14c0ea7/iptables-alerter/0.log" Apr 23 14:37:11.247361 ip-10-0-139-40 kubenswrapper[2582]: I0423 14:37:11.247282 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9ss9q_6d70fbe1-9754-48a1-82b6-1656723cda25/tuned/0.log"