Apr 16 19:53:56.332389 ip-10-0-129-34 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:56.823125 ip-10-0-129-34 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:56.823125 ip-10-0-129-34 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:56.823125 ip-10-0-129-34 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:56.823125 ip-10-0-129-34 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:56.823125 ip-10-0-129-34 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
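The deprecation notices above all point at the same remedy: carry these settings in the file passed via --config instead of on the command line. As an illustrative aside (not part of the captured log), a minimal sketch of what that fragment could look like; the field names are the upstream KubeletConfiguration ones, but the values below are placeholders, not taken from this node:

```yaml
# Hypothetical fragment of the file passed via the kubelet's --config flag.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
systemReserved:                                               # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:          # --minimum-container-ttl-duration is superseded by eviction settings
  memory.available: "200Mi"
```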
Apr 16 19:53:56.825003 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.824878 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:56.829691 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829673 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.829691 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829692 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829696 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829699 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829703 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829705 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829708 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829711 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829715 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829717 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829720 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829722 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829725 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829728 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829730 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829734 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829737 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829740 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829744 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829748 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829751 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.829784 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829754 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829757 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829760 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829762 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829765 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829768 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829770 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829773 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829775 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829778 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829781 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829783 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829786 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829789 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829792 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829795 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829798 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829800 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829803 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829805 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.830283 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829808 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829812 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829815 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829819 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829822 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829825 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829827 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829829 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829832 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829834 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829837 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829840 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829843 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829845 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829848 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829852 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829854 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829857 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829859 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829862 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.830812 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829864 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829867 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829869 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829872 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829875 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829878 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829881 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829884 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829887 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829890 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829893 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829895 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829898 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829900 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829903 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829905 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829909 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829911 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829914 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829917 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.831316 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829919 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829922 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829925 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829928 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.829930 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830363 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830372 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830376 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830380 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830383 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830385 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830388 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830391 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830393 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830396 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830399 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830401 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830405 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830408 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.831798 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830410 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830413 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830415 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830418 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830420 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830423 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830425 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830428 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830431 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830435 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830437 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830440 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830443 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830445 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830448 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830451 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830453 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830456 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830458 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830461 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.832329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830464 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830467 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830469 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830471 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830474 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830476 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830479 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830481 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830484 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830486 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830489 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830491 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830494 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830496 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830499 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830501 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830504 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830507 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830509 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830512 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.832828 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830514 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830517 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830521 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830523 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830526 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830528 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830531 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830533 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830536 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830539 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830541 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830543 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830546 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830549 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830551 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830554 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830556 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830559 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830561 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830564 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.833376 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830567 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830570 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830572 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830575 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830579 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830581 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830584 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830587 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830589 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830592 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830594 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.830597 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831827 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831844 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831850 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831855 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831859 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831862 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831871 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831875 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831879 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:56.833859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831882 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831886 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831890 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831893 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831896 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831900 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831903 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831906 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831909 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831912 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831917 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831921 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831924 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831926 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831930 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831938 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831942 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831946 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831949 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831952 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831955 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831958 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831962 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831965 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831969 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:56.834396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831972 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831975 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831978 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831981 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831984 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831989 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831992 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831995 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.831998 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832002 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832006 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832009 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832012 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832015 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832018 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832021 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832024 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832027 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832030 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832033 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832036 2570 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832040 2570 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832044 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832048 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832066 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832070 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:53:56.835000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832074 2570 flags.go:64] FLAG: --help="false" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832077 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-129-34.ec2.internal" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832081 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832084 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832087 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832090 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832094 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832097 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832100 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:53:56.832103 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832106 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832115 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832118 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832121 2570 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832124 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832127 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832130 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832133 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832136 2570 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832139 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832142 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832146 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832152 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:56.835655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832155 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:56.836289 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832158 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832160 2570 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832163 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832167 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832171 2570 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832174 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832179 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832189 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832193 2570 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832196 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832199 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832202 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832205 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832208 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832211 2570 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832214 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832224 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832227 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832230 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832233 2570 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832236 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832242 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832245 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:56.836289 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832249 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832252 2570 flags.go:64] FLAG: --port="10250" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832255 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832258 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-040fcbf53d74559d0" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832261 2570 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832264 
2570 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832267 2570 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832270 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832273 2570 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832277 2570 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832280 2570 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832283 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832286 2570 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832289 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832294 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832297 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832300 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832304 2570 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832307 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832310 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832313 2570 
flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832316 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832319 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832322 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832326 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832329 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:56.836878 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832333 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832336 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832339 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832342 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832345 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832348 2570 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832353 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832359 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832362 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 16 
19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832365 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832370 2570 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832373 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832376 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832378 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832381 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832384 2570 flags.go:64] FLAG: --v="2" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832389 2570 flags.go:64] FLAG: --version="false" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832393 2570 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832397 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.832400 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832508 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832512 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832516 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: W0416 
19:53:56.832518 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:56.837518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832522 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832524 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832527 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832529 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832532 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832534 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832537 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832539 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832543 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832546 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832548 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832551 2570 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832554 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832556 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832560 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832563 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832566 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832568 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832571 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832573 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:56.838104 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832576 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832579 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832581 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832584 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832586 2570 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832589 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832591 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832594 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832597 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832600 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832602 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832606 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832610 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832613 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832616 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832619 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832621 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832624 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832627 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:56.838623 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832629 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832632 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832634 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832637 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832639 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 
19:53:56.832642 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832645 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832648 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832652 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832656 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832659 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832662 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832665 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832667 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832670 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832672 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832675 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832677 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:56.839105 
ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832680 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832682 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:56.839105 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832685 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832687 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832690 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832693 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832695 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832697 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832700 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832703 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832705 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832708 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832710 2570 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832713 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832715 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832717 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832720 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832722 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832725 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832727 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832730 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832734 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.839602 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832737 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832739 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.832742 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.833441 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.839875 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.839895 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839945 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839950 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839954 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839957 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839960 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839964 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839967 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839970 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839972 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839975 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.840120 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839979 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839983 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839988 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839991 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839994 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.839997 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840000 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840002 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840005 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840009 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840012 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840015 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840017 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840020 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840023 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840027 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840032 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840035 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840038 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.840518 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840041 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840043 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840048 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840051 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840072 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840074 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840077 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840080 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840083 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840086 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840089 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840091 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840095 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840097 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840100 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840102 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840105 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840107 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840110 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840112 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.840971 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840115 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840118 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840121 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840123 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840126 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840129 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840131 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840134 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840136 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840139 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840142 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840144 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840147 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840149 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840152 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840156 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840159 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840161 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840164 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840166 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.841563 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840169 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840172 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840174 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840177 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840179 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840182 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840184 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840187 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840189 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840192 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840194 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840196 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840199 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840202 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840204 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840207 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.842143 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840209 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.840214 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840315 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840320 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840323 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840326 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840329 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840331 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840334 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840336 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840340 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840343 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840346 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840349 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840352 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840354 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:56.842628 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840356 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840359 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840361 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840363 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840366 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840369 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840371 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840374 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840378 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840382 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840385 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840389 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840392 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840395 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840397 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840400 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840403 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840406 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840409 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:56.843035 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840411 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840414 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840416 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840419 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840421 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840424 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840426 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840429 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840432 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840434 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840438 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840440 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840443 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840446 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840448 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840451 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840453 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840456 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840458 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840461 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:56.843540 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840463 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840466 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840468 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840471 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840474 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840477 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840479 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840482 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840484 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840487 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840489 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840492 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840494 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840497 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840499 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840502 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840504 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840506 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840509 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840511 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:56.844040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840514 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840517 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840520 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840523 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840525 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840528 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840531 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840533 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840535 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840538 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840540 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840543 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:56.840546 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.840551 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.841383 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:53:56.844548 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.844113 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:53:56.846861 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.846847 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 19:53:56.846966 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.846947 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:56.847857 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.847845 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:56.876472 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.876344 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:56.883192 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.883166 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:56.898900 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.898878 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:53:56.904440 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.904421 2570 log.go:25] "Validated CRI v1 image API"
Apr 16 19:53:56.905630 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.905605 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:53:56.905919 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.905902 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:56.907887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.907866 2570 fs.go:135] Filesystem UUIDs: map[1a0d9d3c-bce1-4f13-8ba3-a066718c2b3c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a861e37e-b527-4f8d-b8dd-e4f5b00d50f3:/dev/nvme0n1p3]
Apr 16 19:53:56.907965 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.907885 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:53:56.912990 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.912867 2570 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:56.911601506 +0000 UTC m=+0.446467533 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100349 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27529c54d1aeb468946703a4822341 SystemUUID:ec27529c-54d1-aeb4-6894-6703a4822341 BootID:f2d2c06c-014d-462f-a49a-13268a5e2d18 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bd:d8:96:b4:81 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bd:d8:96:b4:81 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:ef:a6:11:c5:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:53:56.912990 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.912985 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:53:56.913136 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.913124 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:53:56.915156 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.915131 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:53:56.915294 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.915158 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-34.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:53:56.915346 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.915304 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:53:56.915346 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.915312 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:53:56.915346 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.915325 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:56.915346 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.915345 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:56.916753 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.916741 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:56.916863 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.916854 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:53:56.919607 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.919597 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:53:56.919687 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.919615 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:53:56.919687 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.919626 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:53:56.919687 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.919636 2570 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:53:56.919687 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.919658 2570 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 19:53:56.920848 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.920834 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:56.920906 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.920854 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:56.926961 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.926943 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:56.929177 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.929159 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:56.930493 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930481 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930498 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930504 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930510 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930516 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930522 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930528 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930533 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930540 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:56.930544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930546 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:56.930804 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930560 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:56.930804 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.930570 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:56.931463 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.931449 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:56.931463 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.931464 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:56.933179 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.933155 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-34.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:56.933256 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.933230 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:56.933971 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.933953 2570 csr.go:274] "Certificate 
signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-grhc9" Apr 16 19:53:56.935178 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.935163 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:56.935260 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.935207 2570 server.go:1295] "Started kubelet" Apr 16 19:53:56.935314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.935284 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:56.935380 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.935344 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:56.935431 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.935397 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:56.936166 ip-10-0-129-34 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:53:56.937074 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.936939 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:56.937344 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.937332 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:56.941713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.941691 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:56.941954 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.941940 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-grhc9" Apr 16 19:53:56.942575 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.942555 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:56.943307 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943289 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" 
Apr 16 19:53:56.943307 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943292 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:53:56.943441 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943317 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:56.943441 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943388 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:56.943441 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943399 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:56.943441 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943417 2570 factory.go:55] Registering systemd factory Apr 16 19:53:56.943441 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943438 2570 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:56.943669 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.943546 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:56.943797 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943783 2570 factory.go:153] Registering CRI-O factory Apr 16 19:53:56.943856 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943813 2570 factory.go:223] Registration of the crio container factory successfully Apr 16 19:53:56.943901 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943859 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:53:56.943901 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943885 2570 factory.go:103] Registering Raw factory Apr 16 19:53:56.943901 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.943901 2570 manager.go:1196] Started watching for new ooms in manager Apr 16 19:53:56.944403 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:53:56.944384 2570 manager.go:319] Starting recovery of all containers Apr 16 19:53:56.946873 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.946596 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:53:56.948478 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.948447 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-34.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:53:56.948673 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.948648 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-34.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:56.948841 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.948814 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:53:56.949669 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:56.948462 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-34.ec2.internal.18a6ee68cd73f227 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-34.ec2.internal,UID:ip-10-0-129-34.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-34.ec2.internal,},FirstTimestamp:2026-04-16 19:53:56.935176743 +0000 UTC m=+0.470042770,LastTimestamp:2026-04-16 19:53:56.935176743 +0000 UTC m=+0.470042770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-34.ec2.internal,}" Apr 16 19:53:56.957931 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.957911 2570 manager.go:324] Recovery completed Apr 16 19:53:56.962050 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.962034 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:56.964849 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.964830 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:56.964921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.964864 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:56.964921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.964878 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:56.965418 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.965405 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:53:56.965418 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.965417 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:53:56.965514 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.965432 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:56.968630 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:53:56.968613 2570 policy_none.go:49] "None policy: Start" Apr 16 19:53:56.968630 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.968631 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:53:56.968752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:56.968641 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.013681 2570 manager.go:341] "Starting Device Plugin manager" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.013720 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.013734 2570 server.go:85] "Starting device plugin registration server" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.014019 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.014032 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.014152 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.014280 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.014290 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.014714 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 19:53:57.020708 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.014752 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.104995 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.104917 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:53:57.106233 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.106204 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:53:57.106233 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.106236 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:53:57.106374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.106267 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 19:53:57.106374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.106274 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:53:57.106459 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.106375 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:53:57.110605 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.110584 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:57.114258 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.114246 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:57.115011 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.114994 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:57.115097 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.115022 2570 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:57.115097 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.115033 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:57.115097 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.115069 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.120699 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.120684 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.120769 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.120705 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-34.ec2.internal\": node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.163656 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.163624 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.207378 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.207330 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal"] Apr 16 19:53:57.207490 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.207428 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:57.208467 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.208445 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:57.208575 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.208484 2570 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-129-34.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:57.208575 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.208498 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:57.210839 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.210820 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:57.210990 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.210972 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.211092 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211005 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:57.211606 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211587 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:57.211671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211589 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:57.211671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211642 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:57.211671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211653 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:57.211671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211619 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:57.211818 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.211693 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:57.213921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.213904 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.213999 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.213928 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:57.214620 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.214602 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:57.214701 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.214628 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:57.214701 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.214640 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:57.238704 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.238676 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-34.ec2.internal\" not found" node="ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.242371 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.242351 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-34.ec2.internal\" not found" node="ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.244313 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.244293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/84b73010dde18c2e537db575f397d1b5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal\" (UID: \"84b73010dde18c2e537db575f397d1b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.244397 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.244322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/52f7ef5b748605fa2e3167b9e181ddfa-config\") pod \"kube-apiserver-proxy-ip-10-0-129-34.ec2.internal\" (UID: \"52f7ef5b748605fa2e3167b9e181ddfa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.244397 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.244349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/84b73010dde18c2e537db575f397d1b5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal\" (UID: \"84b73010dde18c2e537db575f397d1b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.264280 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.264256 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.344621 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.344590 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/84b73010dde18c2e537db575f397d1b5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal\" (UID: \"84b73010dde18c2e537db575f397d1b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.344621 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.344620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84b73010dde18c2e537db575f397d1b5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal\" (UID: \"84b73010dde18c2e537db575f397d1b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.344821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.344639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/52f7ef5b748605fa2e3167b9e181ddfa-config\") pod \"kube-apiserver-proxy-ip-10-0-129-34.ec2.internal\" (UID: \"52f7ef5b748605fa2e3167b9e181ddfa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.344821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.344668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84b73010dde18c2e537db575f397d1b5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal\" (UID: \"84b73010dde18c2e537db575f397d1b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.344821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.344675 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/84b73010dde18c2e537db575f397d1b5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal\" (UID: \"84b73010dde18c2e537db575f397d1b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.344821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.344709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/52f7ef5b748605fa2e3167b9e181ddfa-config\") pod \"kube-apiserver-proxy-ip-10-0-129-34.ec2.internal\" (UID: \"52f7ef5b748605fa2e3167b9e181ddfa\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.364842 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.364791 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.465611 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.465580 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.540816 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.540778 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.544679 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.544659 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.566502 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.566469 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.667003 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.666908 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.767522 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.767489 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.846860 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.846815 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:53:57.847557 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.846970 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:57.867606 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.867579 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-34.ec2.internal\" not found" Apr 16 19:53:57.870354 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.870338 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:57.904286 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.904259 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:57.920595 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.920543 2570 apiserver.go:52] "Watching apiserver" Apr 16 19:53:57.935906 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.935882 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:53:57.936303 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.936278 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-rt6tz","openshift-network-operator/iptables-alerter-ktbn6","openshift-cluster-node-tuning-operator/tuned-b4vm2","openshift-ovn-kubernetes/ovnkube-node-t5tmb","kube-system/konnectivity-agent-9ktg6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd","openshift-image-registry/node-ca-2f4zv","openshift-multus/multus-additional-cni-plugins-ljvqz","openshift-multus/multus-njxv9","openshift-multus/network-metrics-daemon-8x8wb"] Apr 16 19:53:57.940755 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.940720 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:53:57.940863 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.940816 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:53:57.940863 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.940850 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:57.941777 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.941756 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:57.943215 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.942969 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.944082 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.943392 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.945114 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.945090 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:56 +0000 UTC" deadline="2027-11-03 03:24:29.294728747 +0000 UTC" Apr 16 19:53:57.945160 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.945113 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13567h30m31.349617704s" Apr 16 19:53:57.945253 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.945240 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.947543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-sys\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.947648 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947628 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-run-netns\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.947704 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947664 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-cni-netd\") pod \"ovnkube-node-t5tmb\" (UID: 
\"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.947750 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947692 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.947750 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947739 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-lib-modules\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.947837 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947769 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:57.947837 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxprd\" (UniqueName: \"kubernetes.io/projected/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-kube-api-access-kxprd\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.947837 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947820 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-ovn\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.947970 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.947839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-run-ovn-kubernetes\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948020 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovn-node-metrics-cert\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948397 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948379 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-modprobe-d\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.948488 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948412 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysconfig\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.948488 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-env-overrides\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948488 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-tuned\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.948488 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948488 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948477 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-node-log\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-cni-bin\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948522 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovnkube-config\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-kubelet\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-slash\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-var-lib-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948658 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d600b8d5-af76-4864-85f2-894bc334d737-host-slash\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-systemd\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-etc-openvswitch\") 
pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948731 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d600b8d5-af76-4864-85f2-894bc334d737-iptables-alerter-script\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:57.948752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948745 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wwmm\" (UniqueName: \"kubernetes.io/projected/d600b8d5-af76-4864-85f2-894bc334d737-kube-api-access-6wwmm\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948764 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-var-lib-kubelet\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-host\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948792 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-run\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948805 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-tmp\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948824 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-log-socket\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948852 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovnkube-script-lib\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwc72\" (UniqueName: \"kubernetes.io/projected/1a71a363-02b5-43c8-ac58-44ba0eb22832-kube-api-access-gwc72\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.949111 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948887 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-kubernetes\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948916 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysctl-d\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-systemd-units\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948945 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysctl-conf\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.949111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.948962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-systemd\") pod \"tuned-b4vm2\" (UID: 
\"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:57.951406 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951389 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:57.951646 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951630 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:53:57.951857 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951833 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:57.951928 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sr4qz\"" Apr 16 19:53:57.951928 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951892 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zj89f\"" Apr 16 19:53:57.952028 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951927 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:57.952028 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951945 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:57.952028 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.951893 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bcbns\"" Apr 16 19:53:57.952028 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952017 2570 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vkgx9\"" Apr 16 19:53:57.952243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952018 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:53:57.952243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952078 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:53:57.952243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952070 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:53:57.952527 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952506 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:57.952527 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952516 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:53:57.952693 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952580 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:53:57.952693 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.952672 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:53:57.953521 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.953504 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:57.955541 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.955523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:53:57.955837 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.955815 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:57.957494 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.957478 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:53:57.957894 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.957879 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:53:57.958015 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.957997 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-njxv9" Apr 16 19:53:57.958157 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.958137 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sh69q\"" Apr 16 19:53:57.960166 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.960149 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:57.960238 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:57.960202 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:53:57.960990 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.960960 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:53:57.961860 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.961844 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:53:57.963261 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.963246 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:53:57.963261 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.963255 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:53:57.963580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.963568 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4dx9c\"" Apr 16 19:53:57.963827 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.963811 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:53:57.963903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.963866 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gjj4h\"" Apr 16 19:53:57.970490 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.970471 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:53:57.970747 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.970732 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:53:57.970963 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.970939 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:53:57.977133 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.977116 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:57.977881 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.977859 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:53:57.979267 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.979251 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:53:57.982013 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.981998 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal"] Apr 16 19:53:57.982307 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.982289 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:53:57.982377 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.982365 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" Apr 16 19:53:57.985068 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.985041 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:53:57.986658 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:57.986645 2570 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-75vmn\"" Apr 16 19:53:58.007897 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.007865 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:53:58.008035 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.007990 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal"] Apr 16 19:53:58.034036 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.034016 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qfnk6" Apr 16 19:53:58.044229 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.044206 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:58.050070 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-host\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050174 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050126 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-host\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050237 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-run\") pod \"tuned-b4vm2\" 
(UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050237 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-log-socket\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050332 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050249 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-run\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050332 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwc72\" (UniqueName: \"kubernetes.io/projected/1a71a363-02b5-43c8-ac58-44ba0eb22832-kube-api-access-gwc72\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050332 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050281 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-socket-dir-parent\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.050332 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050299 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94prt\" (UniqueName: 
\"kubernetes.io/projected/1ceab864-ede8-473b-8607-10b5f8b271d4-kube-api-access-94prt\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050348 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-log-socket\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-system-cni-dir\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-kubernetes\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050438 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysctl-d\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysctl-conf\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-systemd\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-sys\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-kubernetes\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-run-netns\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050563 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysctl-d\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050599 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-systemd\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-system-cni-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050613 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysctl-conf\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-sys\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050667 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-run-netns\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050690 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-conf-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " 
pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050720 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-cnibin\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-lib-modules\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxprd\" (UniqueName: \"kubernetes.io/projected/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-kube-api-access-kxprd\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.050903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-ovn\") pod 
\"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050926 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-lib-modules\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-ovn\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.050981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovn-node-metrics-cert\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051009 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-cni-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8nst\" (UniqueName: 
\"kubernetes.io/projected/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-kube-api-access-g8nst\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysconfig\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051101 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-sysconfig\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-cnibin\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-daemon-config\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-registration-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051158 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-tuned\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051173 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-node-log\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051188 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-cni-bin\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovnkube-config\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051221 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-node-log\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051229 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-os-release\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.051491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051262 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051262 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-cni-bin\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-slash\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051316 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-var-lib-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051340 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d600b8d5-af76-4864-85f2-894bc334d737-host-slash\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051329 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051368 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-slash\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-serviceca\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-var-lib-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-socket-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d600b8d5-af76-4864-85f2-894bc334d737-host-slash\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rts7p\" (UniqueName: \"kubernetes.io/projected/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-kube-api-access-rts7p\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-kubelet\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:53:58.051488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-var-lib-kubelet\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-tmp\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovnkube-script-lib\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051550 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-var-lib-kubelet\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.052318 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-etc-selinux\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.053099 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-sys-fs\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-systemd-units\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ceab864-ede8-473b-8607-10b5f8b271d4-cni-binary-copy\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-systemd-units\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-cni-bin\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " 
pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051775 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-cni-netd\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051800 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95l5b\" (UniqueName: \"kubernetes.io/projected/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-kube-api-access-95l5b\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051846 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-multus-certs\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovnkube-config\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051915 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qfnk6" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051929 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-run-ovn-kubernetes\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051955 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjgq\" (UniqueName: \"kubernetes.io/projected/405a22ed-e497-47a7-95e5-0362e26a6e43-kube-api-access-zdjgq\") pod 
\"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.051997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-modprobe-d\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.053099 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052022 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-env-overrides\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052041 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-run-ovn-kubernetes\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052118 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-cni-netd\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovnkube-script-lib\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-host\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052168 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-cni-multus\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052196 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-hostroot\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-modprobe-d\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052221 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052248 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0114be71-3ca1-48b4-bcff-512d02284f83-konnectivity-ca\") pod \"konnectivity-agent-9ktg6\" (UID: \"0114be71-3ca1-48b4-bcff-512d02284f83\") " pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-kubelet\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052301 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052313 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052326 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-os-release\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-host-kubelet\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-k8s-cni-cncf-io\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-netns\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.053578 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a71a363-02b5-43c8-ac58-44ba0eb22832-env-overrides\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052473 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-etc-kubernetes\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-systemd\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052534 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-etc-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d600b8d5-af76-4864-85f2-894bc334d737-iptables-alerter-script\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wwmm\" (UniqueName: \"kubernetes.io/projected/d600b8d5-af76-4864-85f2-894bc334d737-kube-api-access-6wwmm\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.054080 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:53:58.052606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-etc-openvswitch\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052617 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a71a363-02b5-43c8-ac58-44ba0eb22832-run-systemd\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052659 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0114be71-3ca1-48b4-bcff-512d02284f83-agent-certs\") pod \"konnectivity-agent-9ktg6\" (UID: \"0114be71-3ca1-48b4-bcff-512d02284f83\") " pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.052688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-device-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.054080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.053071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d600b8d5-af76-4864-85f2-894bc334d737-iptables-alerter-script\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " 
pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.054549 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.054504 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-tmp\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.054614 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.054549 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-etc-tuned\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.054891 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.054871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a71a363-02b5-43c8-ac58-44ba0eb22832-ovn-node-metrics-cert\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.066752 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.066725 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.066860 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.066758 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.066860 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.066770 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.066951 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.066871 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.566824172 +0000 UTC m=+2.101690193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.068510 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.068484 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxprd\" (UniqueName: \"kubernetes.io/projected/3d50365e-fb34-48f7-a1c1-833ed9e44ff1-kube-api-access-kxprd\") pod \"tuned-b4vm2\" (UID: \"3d50365e-fb34-48f7-a1c1-833ed9e44ff1\") " pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.068844 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.068821 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwc72\" (UniqueName: \"kubernetes.io/projected/1a71a363-02b5-43c8-ac58-44ba0eb22832-kube-api-access-gwc72\") pod \"ovnkube-node-t5tmb\" (UID: \"1a71a363-02b5-43c8-ac58-44ba0eb22832\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.069394 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.069375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wwmm\" (UniqueName: 
\"kubernetes.io/projected/d600b8d5-af76-4864-85f2-894bc334d737-kube-api-access-6wwmm\") pod \"iptables-alerter-ktbn6\" (UID: \"d600b8d5-af76-4864-85f2-894bc334d737\") " pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.153271 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:58.153399 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjgq\" (UniqueName: \"kubernetes.io/projected/405a22ed-e497-47a7-95e5-0362e26a6e43-kube-api-access-zdjgq\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.153399 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-host\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.153399 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-cni-multus\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153399 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-hostroot\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153399 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.153371 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.153399 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0114be71-3ca1-48b4-bcff-512d02284f83-konnectivity-ca\") pod \"konnectivity-agent-9ktg6\" (UID: \"0114be71-3ca1-48b4-bcff-512d02284f83\") " pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-os-release\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.153442 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.653423536 +0000 UTC m=+2.188289551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-host\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153452 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-hostroot\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-cni-multus\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-k8s-cni-cncf-io\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-os-release\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-netns\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153527 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-k8s-cni-cncf-io\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-etc-kubernetes\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0114be71-3ca1-48b4-bcff-512d02284f83-agent-certs\") pod \"konnectivity-agent-9ktg6\" (UID: \"0114be71-3ca1-48b4-bcff-512d02284f83\") " pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-netns\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-etc-kubernetes\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153596 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-device-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.153654 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153637 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-socket-dir-parent\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94prt\" (UniqueName: \"kubernetes.io/projected/1ceab864-ede8-473b-8607-10b5f8b271d4-kube-api-access-94prt\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-device-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-system-cni-dir\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-socket-dir-parent\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153733 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153739 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-system-cni-dir\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153763 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-system-cni-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153816 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-conf-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-cnibin\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-cni-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:53:58.153901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8nst\" (UniqueName: \"kubernetes.io/projected/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-kube-api-access-g8nst\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-conf-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-cnibin\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153940 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-system-cni-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153954 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-daemon-config\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.154355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153980 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-registration-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.153983 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-cni-dir\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0114be71-3ca1-48b4-bcff-512d02284f83-konnectivity-ca\") pod \"konnectivity-agent-9ktg6\" (UID: \"0114be71-3ca1-48b4-bcff-512d02284f83\") " pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-os-release\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154079 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 19:53:58.154087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-os-release\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-registration-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154269 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-cnibin\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-serviceca\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-socket-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 19:53:58.154368 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rts7p\" (UniqueName: \"kubernetes.io/projected/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-kube-api-access-rts7p\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-kubelet\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-etc-selinux\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-sys-fs\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154473 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-socket-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ceab864-ede8-473b-8607-10b5f8b271d4-cni-binary-copy\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154517 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-cni-bin\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154525 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-kubelet\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: 
\"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154616 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ceab864-ede8-473b-8607-10b5f8b271d4-multus-daemon-config\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95l5b\" (UniqueName: \"kubernetes.io/projected/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-kube-api-access-95l5b\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-multus-certs\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154691 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-var-lib-cni-bin\") 
pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-serviceca\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154749 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-etc-selinux\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154790 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154823 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/405a22ed-e497-47a7-95e5-0362e26a6e43-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154818 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-sys-fs\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.154857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ceab864-ede8-473b-8607-10b5f8b271d4-host-run-multus-certs\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.155143 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/405a22ed-e497-47a7-95e5-0362e26a6e43-cnibin\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.155684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.155221 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ceab864-ede8-473b-8607-10b5f8b271d4-cni-binary-copy\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.156294 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.156221 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0114be71-3ca1-48b4-bcff-512d02284f83-agent-certs\") pod \"konnectivity-agent-9ktg6\" (UID: \"0114be71-3ca1-48b4-bcff-512d02284f83\") " pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.156579 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.156544 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f7ef5b748605fa2e3167b9e181ddfa.slice/crio-ffe3d8e4b3f1b5d19496a40094178756c8d293473dc47e320c5c3d11785e37fd WatchSource:0}: Error finding container ffe3d8e4b3f1b5d19496a40094178756c8d293473dc47e320c5c3d11785e37fd: Status 404 returned error can't find the container with id ffe3d8e4b3f1b5d19496a40094178756c8d293473dc47e320c5c3d11785e37fd Apr 16 19:53:58.157145 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.157124 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84b73010dde18c2e537db575f397d1b5.slice/crio-eb0b30439b476496a8aca08129b71559fc0bd79b1e9b96a14dca064746902438 WatchSource:0}: Error finding container eb0b30439b476496a8aca08129b71559fc0bd79b1e9b96a14dca064746902438: Status 404 returned error can't find the container with id eb0b30439b476496a8aca08129b71559fc0bd79b1e9b96a14dca064746902438 Apr 16 19:53:58.162933 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.162918 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:53:58.163254 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.163239 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjgq\" (UniqueName: \"kubernetes.io/projected/405a22ed-e497-47a7-95e5-0362e26a6e43-kube-api-access-zdjgq\") pod \"multus-additional-cni-plugins-ljvqz\" (UID: \"405a22ed-e497-47a7-95e5-0362e26a6e43\") " pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.166267 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.166245 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94prt\" (UniqueName: \"kubernetes.io/projected/1ceab864-ede8-473b-8607-10b5f8b271d4-kube-api-access-94prt\") pod \"multus-njxv9\" (UID: \"1ceab864-ede8-473b-8607-10b5f8b271d4\") " pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.167266 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.167245 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95l5b\" (UniqueName: \"kubernetes.io/projected/d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b-kube-api-access-95l5b\") pod \"node-ca-2f4zv\" (UID: \"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b\") " pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.167565 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.167550 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8nst\" (UniqueName: \"kubernetes.io/projected/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-kube-api-access-g8nst\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:58.169494 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.169477 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rts7p\" (UniqueName: \"kubernetes.io/projected/5ad55dd7-a35c-4704-8ecf-446e4cc0c66f-kube-api-access-rts7p\") pod \"aws-ebs-csi-driver-node-gwhqd\" (UID: \"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 
19:53:58.264527 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.264474 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ktbn6" Apr 16 19:53:58.270991 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.270966 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd600b8d5_af76_4864_85f2_894bc334d737.slice/crio-64bb250a712a977e5af5198f81c6e3256074f088da43db3b23ff9dce7c4d99fb WatchSource:0}: Error finding container 64bb250a712a977e5af5198f81c6e3256074f088da43db3b23ff9dce7c4d99fb: Status 404 returned error can't find the container with id 64bb250a712a977e5af5198f81c6e3256074f088da43db3b23ff9dce7c4d99fb Apr 16 19:53:58.299202 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.299175 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" Apr 16 19:53:58.304797 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.304767 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d50365e_fb34_48f7_a1c1_833ed9e44ff1.slice/crio-2b220e550dd476d2b0274d90b272417f63426938cc171a0bde23e97442a0df85 WatchSource:0}: Error finding container 2b220e550dd476d2b0274d90b272417f63426938cc171a0bde23e97442a0df85: Status 404 returned error can't find the container with id 2b220e550dd476d2b0274d90b272417f63426938cc171a0bde23e97442a0df85 Apr 16 19:53:58.322781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.322760 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:53:58.328802 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.328762 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a71a363_02b5_43c8_ac58_44ba0eb22832.slice/crio-b7ffc3525f820ce984efa5d69eab5231a63365e6e0dd7441b80016d888552639 WatchSource:0}: Error finding container b7ffc3525f820ce984efa5d69eab5231a63365e6e0dd7441b80016d888552639: Status 404 returned error can't find the container with id b7ffc3525f820ce984efa5d69eab5231a63365e6e0dd7441b80016d888552639 Apr 16 19:53:58.344115 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.344092 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:53:58.349919 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.349896 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0114be71_3ca1_48b4_bcff_512d02284f83.slice/crio-6145f92b674b0459a1af8cb2e507bfe39bcc2265f0fc9153e1faeb90c6978f5e WatchSource:0}: Error finding container 6145f92b674b0459a1af8cb2e507bfe39bcc2265f0fc9153e1faeb90c6978f5e: Status 404 returned error can't find the container with id 6145f92b674b0459a1af8cb2e507bfe39bcc2265f0fc9153e1faeb90c6978f5e Apr 16 19:53:58.366151 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.366124 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" Apr 16 19:53:58.372329 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.372306 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad55dd7_a35c_4704_8ecf_446e4cc0c66f.slice/crio-d1f966536bce426671d58c5a67fc10c2ede4a894727a213704d6638ee81807d4 WatchSource:0}: Error finding container d1f966536bce426671d58c5a67fc10c2ede4a894727a213704d6638ee81807d4: Status 404 returned error can't find the container with id d1f966536bce426671d58c5a67fc10c2ede4a894727a213704d6638ee81807d4 Apr 16 19:53:58.387416 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.387395 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:58.390557 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.390539 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2f4zv" Apr 16 19:53:58.396153 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.396126 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fc2ca6_b29a_4acb_90f2_20b9a6a8854b.slice/crio-4835732a1052a6e1ac3e9b78d09ef3c281a4d3b2c062c5ea94a5c7cb701a3d9c WatchSource:0}: Error finding container 4835732a1052a6e1ac3e9b78d09ef3c281a4d3b2c062c5ea94a5c7cb701a3d9c: Status 404 returned error can't find the container with id 4835732a1052a6e1ac3e9b78d09ef3c281a4d3b2c062c5ea94a5c7cb701a3d9c Apr 16 19:53:58.405427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.405412 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" Apr 16 19:53:58.409988 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.409968 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-njxv9" Apr 16 19:53:58.411182 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.411158 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405a22ed_e497_47a7_95e5_0362e26a6e43.slice/crio-543304cb9c9aebfc4af4e53fd26dc042580d8e99d1ed10c7d7c312d0e19a9204 WatchSource:0}: Error finding container 543304cb9c9aebfc4af4e53fd26dc042580d8e99d1ed10c7d7c312d0e19a9204: Status 404 returned error can't find the container with id 543304cb9c9aebfc4af4e53fd26dc042580d8e99d1ed10c7d7c312d0e19a9204 Apr 16 19:53:58.417440 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:53:58.417420 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ceab864_ede8_473b_8607_10b5f8b271d4.slice/crio-3564cb701ee92544bb6f422e03259afb9f468fdd3a116e3566fcdcfaa13e7e8f WatchSource:0}: Error finding container 3564cb701ee92544bb6f422e03259afb9f468fdd3a116e3566fcdcfaa13e7e8f: Status 404 returned error can't find the container with id 3564cb701ee92544bb6f422e03259afb9f468fdd3a116e3566fcdcfaa13e7e8f Apr 16 19:53:58.657985 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.657891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:53:58.658164 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:58.657990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " 
pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:58.658164 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.658123 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.658305 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.658186 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.658166485 +0000 UTC m=+3.193032523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.658599 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.658579 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.658679 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.658605 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.658679 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.658620 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.658679 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:58.658664 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.658649542 +0000 UTC m=+3.193515562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.053031 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.052916 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:58 +0000 UTC" deadline="2027-12-24 18:48:56.628236826 +0000 UTC" Apr 16 19:53:59.053031 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.052960 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14806h54m57.575281164s" Apr 16 19:53:59.106874 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.106842 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:53:59.107075 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.106981 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:53:59.129820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.129759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerStarted","Data":"543304cb9c9aebfc4af4e53fd26dc042580d8e99d1ed10c7d7c312d0e19a9204"} Apr 16 19:53:59.135895 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.135865 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2f4zv" event={"ID":"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b","Type":"ContainerStarted","Data":"4835732a1052a6e1ac3e9b78d09ef3c281a4d3b2c062c5ea94a5c7cb701a3d9c"} Apr 16 19:53:59.147910 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.147837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" event={"ID":"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f","Type":"ContainerStarted","Data":"d1f966536bce426671d58c5a67fc10c2ede4a894727a213704d6638ee81807d4"} Apr 16 19:53:59.157537 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.157496 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9ktg6" event={"ID":"0114be71-3ca1-48b4-bcff-512d02284f83","Type":"ContainerStarted","Data":"6145f92b674b0459a1af8cb2e507bfe39bcc2265f0fc9153e1faeb90c6978f5e"} Apr 16 19:53:59.175789 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.175753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"b7ffc3525f820ce984efa5d69eab5231a63365e6e0dd7441b80016d888552639"} Apr 16 19:53:59.191206 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.191168 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" event={"ID":"3d50365e-fb34-48f7-a1c1-833ed9e44ff1","Type":"ContainerStarted","Data":"2b220e550dd476d2b0274d90b272417f63426938cc171a0bde23e97442a0df85"} Apr 16 19:53:59.204573 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.204535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-njxv9" event={"ID":"1ceab864-ede8-473b-8607-10b5f8b271d4","Type":"ContainerStarted","Data":"3564cb701ee92544bb6f422e03259afb9f468fdd3a116e3566fcdcfaa13e7e8f"} Apr 16 19:53:59.221224 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.221109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ktbn6" event={"ID":"d600b8d5-af76-4864-85f2-894bc334d737","Type":"ContainerStarted","Data":"64bb250a712a977e5af5198f81c6e3256074f088da43db3b23ff9dce7c4d99fb"} Apr 16 19:53:59.249390 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.249267 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" event={"ID":"52f7ef5b748605fa2e3167b9e181ddfa","Type":"ContainerStarted","Data":"ffe3d8e4b3f1b5d19496a40094178756c8d293473dc47e320c5c3d11785e37fd"} Apr 16 19:53:59.257345 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.257274 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" event={"ID":"84b73010dde18c2e537db575f397d1b5","Type":"ContainerStarted","Data":"eb0b30439b476496a8aca08129b71559fc0bd79b1e9b96a14dca064746902438"} Apr 16 19:53:59.414358 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.414274 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:59.669740 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.669665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:53:59.669740 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:53:59.669719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:53:59.669945 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.669855 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:59.669945 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.669871 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:59.669945 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.669884 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.669945 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.669945 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.66992292 +0000 UTC m=+5.204788934 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.670728 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.670708 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:59.670837 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:53:59.670761 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.670746039 +0000 UTC m=+5.205612055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:00.053505 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:00.053413 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:58 +0000 UTC" deadline="2027-10-22 04:59:15.509225604 +0000 UTC" Apr 16 19:54:00.053505 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:00.053455 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13281h5m15.455775756s" Apr 16 19:54:00.107170 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:00.106971 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:00.107170 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:00.107143 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:01.107979 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:01.107460 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:01.107979 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.107589 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:01.687154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:01.687210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.687347 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.687363 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.687376 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.687432 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.687413768 +0000 UTC m=+9.222279788 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.687832 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.687923 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:01.687888 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.687871051 +0000 UTC m=+9.222737086 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:02.106814 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:02.106777 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:02.106989 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:02.106918 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:03.107638 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:03.107183 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:03.107638 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:03.107322 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:04.106811 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.106716 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:04.107003 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:04.106859 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:04.496349 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.495969 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ck5pm"] Apr 16 19:54:04.505364 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.505334 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.509820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.508984 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-r5qgr\"" Apr 16 19:54:04.509820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.509325 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:04.509820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.509711 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:04.609794 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.609751 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m82v7\" (UniqueName: \"kubernetes.io/projected/5246af7c-5c30-44f4-9bad-2723e872fdf5-kube-api-access-m82v7\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.609970 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.609837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5246af7c-5c30-44f4-9bad-2723e872fdf5-tmp-dir\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.609970 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.609913 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5246af7c-5c30-44f4-9bad-2723e872fdf5-hosts-file\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.710980 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.710944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5246af7c-5c30-44f4-9bad-2723e872fdf5-hosts-file\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.710980 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.710989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m82v7\" (UniqueName: \"kubernetes.io/projected/5246af7c-5c30-44f4-9bad-2723e872fdf5-kube-api-access-m82v7\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.711220 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.711019 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5246af7c-5c30-44f4-9bad-2723e872fdf5-tmp-dir\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.711220 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.711182 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5246af7c-5c30-44f4-9bad-2723e872fdf5-hosts-file\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.711994 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.711963 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5246af7c-5c30-44f4-9bad-2723e872fdf5-tmp-dir\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.723320 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.723283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m82v7\" (UniqueName: \"kubernetes.io/projected/5246af7c-5c30-44f4-9bad-2723e872fdf5-kube-api-access-m82v7\") pod \"node-resolver-ck5pm\" (UID: \"5246af7c-5c30-44f4-9bad-2723e872fdf5\") " pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:04.817471 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:04.817390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ck5pm" Apr 16 19:54:05.107238 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:05.106627 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:05.107238 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.106765 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:05.720472 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:05.720421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:05.720484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.720600 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.720615 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.720633 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.720647 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.720676 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.720657053 +0000 UTC m=+17.255523076 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:05.720946 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:05.720696 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.720685782 +0000 UTC m=+17.255551796 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:06.107230 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:06.107141 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:06.107398 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:06.107287 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:07.108487 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:07.108313 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:07.108487 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:07.108428 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:08.106945 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:08.106857 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:08.107097 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:08.106993 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:09.107276 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:09.107242 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:09.107716 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:09.107364 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:10.107247 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:10.107213 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:10.107429 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:10.107345 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:11.106919 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:11.106879 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:11.107131 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:11.107016 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:12.107140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:12.107104 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:12.107554 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:12.107222 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:13.106749 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:13.106711 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:13.106925 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.106825 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:13.775943 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:13.775908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:13.776005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.776113 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.776146 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.776162 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.776178 2570 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.776211 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.776196682 +0000 UTC m=+33.311062696 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:13.776377 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:13.776242 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.776216402 +0000 UTC m=+33.311082421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:14.107584 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:14.107498 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:14.107759 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:14.107638 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:15.107177 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:15.107142 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:15.107605 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:15.107259 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:15.884516 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:15.884089 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5246af7c_5c30_44f4_9bad_2723e872fdf5.slice/crio-0dc5da0fbbd40d5d7a94ead5e82f01435b9bfa17a7e3e906796639299ec6c38c WatchSource:0}: Error finding container 0dc5da0fbbd40d5d7a94ead5e82f01435b9bfa17a7e3e906796639299ec6c38c: Status 404 returned error can't find the container with id 0dc5da0fbbd40d5d7a94ead5e82f01435b9bfa17a7e3e906796639299ec6c38c Apr 16 19:54:16.107597 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.107312 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:16.108338 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:16.107689 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:16.305182 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.304706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"f7b45dab4c786e31507ca3b391b4ae9d12fa8ee0abacbb0d4a1054e70a4405eb"} Apr 16 19:54:16.305182 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.304972 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"1a5930ffc65472fa94382a28df26e48f36001edfa1739e6ed66a3a2918a499c4"} Apr 16 19:54:16.305182 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.304987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"723ad15d87f2a39431ccce791b1f1b2697c7c3179cd8c6290e9176f6ef2ad8d8"} Apr 16 19:54:16.305182 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.304997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"72fc2bf27d414e9fae9a7a49d690e90c3e10a82d61b8aa16e1129a61b5003762"} Apr 16 19:54:16.306600 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.306573 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" event={"ID":"3d50365e-fb34-48f7-a1c1-833ed9e44ff1","Type":"ContainerStarted","Data":"b2d082830d056a6bb33fe42fcbdb4b4efd17a5e91f0c81c02287ed133f7a0b53"} Apr 16 19:54:16.308699 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.308660 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-njxv9" event={"ID":"1ceab864-ede8-473b-8607-10b5f8b271d4","Type":"ContainerStarted","Data":"86e42c7dbe9188099509eabb02738a854deac7a6debf6950bc457dec2d4e2a86"} Apr 16 19:54:16.311127 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.311089 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" event={"ID":"52f7ef5b748605fa2e3167b9e181ddfa","Type":"ContainerStarted","Data":"e467eb185a79d15a4adb7d978e15cb5853203b7ae1de30df6a802aa7d47eae4b"} Apr 16 19:54:16.312429 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.312405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ck5pm" event={"ID":"5246af7c-5c30-44f4-9bad-2723e872fdf5","Type":"ContainerStarted","Data":"0dc5da0fbbd40d5d7a94ead5e82f01435b9bfa17a7e3e906796639299ec6c38c"} Apr 16 19:54:16.341389 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.341245 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b4vm2" podStartSLOduration=1.76920843 podStartE2EDuration="19.341229177s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.306225645 +0000 UTC m=+1.841091659" lastFinishedPulling="2026-04-16 19:54:15.878246379 +0000 UTC m=+19.413112406" observedRunningTime="2026-04-16 19:54:16.323927445 +0000 UTC m=+19.858793507" watchObservedRunningTime="2026-04-16 19:54:16.341229177 +0000 UTC m=+19.876095213" Apr 16 19:54:16.341623 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:16.341587 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-34.ec2.internal" podStartSLOduration=19.341574822 podStartE2EDuration="19.341574822s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:16.34076812 +0000 UTC m=+19.875634157" watchObservedRunningTime="2026-04-16 19:54:16.341574822 +0000 UTC m=+19.876440862" Apr 16 19:54:17.108840 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.108541 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:17.109345 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:17.108891 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:17.317252 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.317213 2570 generic.go:358] "Generic (PLEG): container finished" podID="84b73010dde18c2e537db575f397d1b5" containerID="5ab310a4df7862fa541653c3dde7a9f36bc9f5c57908c4b154074f1cfe307371" exitCode=0 Apr 16 19:54:17.317413 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.317303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" event={"ID":"84b73010dde18c2e537db575f397d1b5","Type":"ContainerDied","Data":"5ab310a4df7862fa541653c3dde7a9f36bc9f5c57908c4b154074f1cfe307371"} Apr 16 19:54:17.318890 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.318861 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ck5pm" event={"ID":"5246af7c-5c30-44f4-9bad-2723e872fdf5","Type":"ContainerStarted","Data":"6fb4392c7d547f2d670252ecdf3c52c6e8482ffe8822a2ad3739ea053169ac8d"} Apr 16 19:54:17.320624 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.320596 2570 generic.go:358] "Generic (PLEG): container finished" podID="405a22ed-e497-47a7-95e5-0362e26a6e43" containerID="877b2132fa3bac1ad01f3d9e5fc518d07e6ce6c006ff9982f03a9f7ba677d8ca" exitCode=0 Apr 16 19:54:17.320741 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.320682 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerDied","Data":"877b2132fa3bac1ad01f3d9e5fc518d07e6ce6c006ff9982f03a9f7ba677d8ca"} Apr 16 19:54:17.322284 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.322085 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2f4zv" 
event={"ID":"d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b","Type":"ContainerStarted","Data":"90cfbe1027eaa9ad0bc1b994ff078ccbee9c75db287b57bc82628e8a26faa28c"} Apr 16 19:54:17.323648 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.323605 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" event={"ID":"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f","Type":"ContainerStarted","Data":"64381dbe6454271aef189666804c6dcb1f8bcbf49a989a37349a25d012d126ea"} Apr 16 19:54:17.325103 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.325068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9ktg6" event={"ID":"0114be71-3ca1-48b4-bcff-512d02284f83","Type":"ContainerStarted","Data":"6e2ca300716090a001f4536a37afea483420f7e284dc80c71bd46fc102630c1b"} Apr 16 19:54:17.328094 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.328071 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"2b0a01eedbcc4395cbd44e4137c12001a13cf1929cd9a96b441939f874d34bad"} Apr 16 19:54:17.328196 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.328101 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"c00d054b3cb3fcd3546b70a5dd7304763791eb40cb000f2bf15263c66d26bf52"} Apr 16 19:54:17.339678 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.339622 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-njxv9" podStartSLOduration=2.8630317720000003 podStartE2EDuration="20.339607398s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.419133802 +0000 UTC m=+1.953999816" lastFinishedPulling="2026-04-16 19:54:15.895709422 +0000 UTC m=+19.430575442" 
observedRunningTime="2026-04-16 19:54:16.36841597 +0000 UTC m=+19.903282007" watchObservedRunningTime="2026-04-16 19:54:17.339607398 +0000 UTC m=+20.874473438" Apr 16 19:54:17.356161 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.356104 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2f4zv" podStartSLOduration=2.875300517 podStartE2EDuration="20.356090912s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.397498098 +0000 UTC m=+1.932364113" lastFinishedPulling="2026-04-16 19:54:15.878288487 +0000 UTC m=+19.413154508" observedRunningTime="2026-04-16 19:54:17.356047855 +0000 UTC m=+20.890913891" watchObservedRunningTime="2026-04-16 19:54:17.356090912 +0000 UTC m=+20.890956947" Apr 16 19:54:17.410864 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.410773 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9ktg6" podStartSLOduration=2.883846018 podStartE2EDuration="20.410755729s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.351353967 +0000 UTC m=+1.886219981" lastFinishedPulling="2026-04-16 19:54:15.878263676 +0000 UTC m=+19.413129692" observedRunningTime="2026-04-16 19:54:17.394034143 +0000 UTC m=+20.928900178" watchObservedRunningTime="2026-04-16 19:54:17.410755729 +0000 UTC m=+20.945621765" Apr 16 19:54:17.653568 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.653539 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:17.813128 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.813001 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:54:17.813756 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.813733 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:54:17.828747 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:17.828698 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ck5pm" podStartSLOduration=13.828678907 podStartE2EDuration="13.828678907s" podCreationTimestamp="2026-04-16 19:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:17.411563874 +0000 UTC m=+20.946429909" watchObservedRunningTime="2026-04-16 19:54:17.828678907 +0000 UTC m=+21.363544942" Apr 16 19:54:18.026625 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.026458 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:17.653562846Z","UUID":"a45d5f48-9e97-4d29-b34e-b7e2ed4a16b9","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:18.028446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.028421 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:18.028446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.028452 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:18.107167 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.107085 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:18.107325 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:18.107205 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:18.331869 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.331821 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ktbn6" event={"ID":"d600b8d5-af76-4864-85f2-894bc334d737","Type":"ContainerStarted","Data":"f4db8d067d367d213b0672ac34a6ce494a082004ee26709f6b49365f2b8d5cf6"} Apr 16 19:54:18.334372 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.334334 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" event={"ID":"84b73010dde18c2e537db575f397d1b5","Type":"ContainerStarted","Data":"7e60e30cba90af76ced3a926bea3cbf95ed5db8cbc423cfe0c9781ed919ae5d8"} Apr 16 19:54:18.337105 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.337075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" event={"ID":"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f","Type":"ContainerStarted","Data":"2ca4d0fc135994a34c32032f82a350dee2baedae7bbe2ba8e52dd8fe205ec0b4"} Apr 16 19:54:18.352210 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.352152 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ktbn6" podStartSLOduration=3.746256243 podStartE2EDuration="21.352136559s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 
19:53:58.272558521 +0000 UTC m=+1.807424535" lastFinishedPulling="2026-04-16 19:54:15.878438822 +0000 UTC m=+19.413304851" observedRunningTime="2026-04-16 19:54:18.35174145 +0000 UTC m=+21.886607496" watchObservedRunningTime="2026-04-16 19:54:18.352136559 +0000 UTC m=+21.887002599" Apr 16 19:54:18.368933 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:18.368708 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-34.ec2.internal" podStartSLOduration=21.368688097 podStartE2EDuration="21.368688097s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:18.367992912 +0000 UTC m=+21.902858948" watchObservedRunningTime="2026-04-16 19:54:18.368688097 +0000 UTC m=+21.903554138" Apr 16 19:54:19.106671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.106636 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:19.106886 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:19.106784 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:19.341603 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.341541 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" event={"ID":"5ad55dd7-a35c-4704-8ecf-446e4cc0c66f","Type":"ContainerStarted","Data":"62b6ac85f496457f403950925423e579466dc44f508fff2c0a7f673f4773d188"} Apr 16 19:54:19.344917 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.344886 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"90624590faa9596774f2dc02aaa8fec74c89b3f603b952b1742b583855ded0af"} Apr 16 19:54:19.345230 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.345212 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:19.366962 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.366870 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwhqd" podStartSLOduration=2.3021766120000002 podStartE2EDuration="22.366852105s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.373820289 +0000 UTC m=+1.908686303" lastFinishedPulling="2026-04-16 19:54:18.438495782 +0000 UTC m=+21.973361796" observedRunningTime="2026-04-16 19:54:19.365663457 +0000 UTC m=+22.900529497" watchObservedRunningTime="2026-04-16 19:54:19.366852105 +0000 UTC m=+22.901718139" Apr 16 19:54:19.841590 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.841319 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:54:19.842146 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:19.842115 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/konnectivity-agent-9ktg6" Apr 16 19:54:20.106707 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:20.106631 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:20.106900 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:20.106767 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:21.106959 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.106876 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:21.107417 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:21.107005 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:21.353354 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.353095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" event={"ID":"1a71a363-02b5-43c8-ac58-44ba0eb22832","Type":"ContainerStarted","Data":"3ec1c5917e386644961afebfe82d77cc8843a32180d3fb65a1ebbeb1f799017c"} Apr 16 19:54:21.353354 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.353344 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:54:21.353354 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.353362 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:54:21.368111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.368083 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:54:21.392640 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.392560 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" podStartSLOduration=6.703671284 podStartE2EDuration="24.392538002s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.33058548 +0000 UTC m=+1.865451494" lastFinishedPulling="2026-04-16 19:54:16.019452197 +0000 UTC m=+19.554318212" observedRunningTime="2026-04-16 19:54:21.391605538 +0000 UTC m=+24.926471574" watchObservedRunningTime="2026-04-16 19:54:21.392538002 +0000 UTC m=+24.927404053" Apr 16 19:54:21.690267 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:21.690197 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ck5pm_5246af7c-5c30-44f4-9bad-2723e872fdf5/dns-node-resolver/0.log" Apr 16 19:54:22.107360 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:54:22.107269 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:22.107838 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:22.107377 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:22.271130 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:22.271101 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2f4zv_d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b/node-ca/0.log" Apr 16 19:54:22.356227 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:22.356193 2570 generic.go:358] "Generic (PLEG): container finished" podID="405a22ed-e497-47a7-95e5-0362e26a6e43" containerID="7f51551380016c616e161e7ef9a5a8106e39cf58a28e4921f3ce3cc4b0bd99c4" exitCode=0 Apr 16 19:54:22.356367 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:22.356279 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerDied","Data":"7f51551380016c616e161e7ef9a5a8106e39cf58a28e4921f3ce3cc4b0bd99c4"} Apr 16 19:54:22.357291 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:22.356854 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:54:22.373351 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:22.373308 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:54:23.107074 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.107036 
2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:23.107209 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:23.107159 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:23.360695 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.360469 2570 generic.go:358] "Generic (PLEG): container finished" podID="405a22ed-e497-47a7-95e5-0362e26a6e43" containerID="d1a759b8a5f01051a0d6c28058b31d9decd79864e8954a9a30bb545c54c0f0bc" exitCode=0 Apr 16 19:54:23.360695 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.360560 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerDied","Data":"d1a759b8a5f01051a0d6c28058b31d9decd79864e8954a9a30bb545c54c0f0bc"} Apr 16 19:54:23.393989 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.393957 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8x8wb"] Apr 16 19:54:23.394164 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.394147 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:23.394297 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:23.394277 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:23.396918 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.396897 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rt6tz"] Apr 16 19:54:23.397024 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:23.396963 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:23.397081 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:23.397029 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:24.365000 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:24.364904 2570 generic.go:358] "Generic (PLEG): container finished" podID="405a22ed-e497-47a7-95e5-0362e26a6e43" containerID="e3ed5b9ec61c60974305b03253418e9bfb1a5a13cea0e17275e4f19d0932f43e" exitCode=0 Apr 16 19:54:24.365454 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:24.364998 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerDied","Data":"e3ed5b9ec61c60974305b03253418e9bfb1a5a13cea0e17275e4f19d0932f43e"} Apr 16 19:54:25.107204 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:25.107079 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:25.107204 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:25.107110 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:25.107420 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:25.107227 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:25.107670 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:25.107629 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:27.108123 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:27.108029 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:27.108624 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:27.108171 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:27.108624 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:27.108189 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:27.108624 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:27.108343 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:29.107101 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:29.107047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:29.107555 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:29.107115 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:29.107555 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.107216 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:29.107555 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.107340 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:29.792353 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:29.792312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:29.792572 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:29.792403 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:29.792572 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.792456 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:29.792572 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.792483 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:29.792572 ip-10-0-129-34 
kubenswrapper[2570]: E0416 19:54:29.792497 2570 projected.go:194] Error preparing data for projected volume kube-api-access-l96v5 for pod openshift-network-diagnostics/network-check-target-rt6tz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:29.792572 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.792505 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:29.792572 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.792562 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5 podName:e33890cf-1250-4378-b5a9-3ef264f23dad nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.792542717 +0000 UTC m=+65.327408734 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-l96v5" (UniqueName: "kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5") pod "network-check-target-rt6tz" (UID: "e33890cf-1250-4378-b5a9-3ef264f23dad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:29.792835 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:29.792583 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs podName:4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.792572746 +0000 UTC m=+65.327438763 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs") pod "network-metrics-daemon-8x8wb" (UID: "4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:31.106501 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:31.106457 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:31.106940 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:31.106471 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:31.106940 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:31.106565 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:31.106940 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:31.106689 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:31.381374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:31.381076 2570 generic.go:358] "Generic (PLEG): container finished" podID="405a22ed-e497-47a7-95e5-0362e26a6e43" containerID="81411fdb8487fb0f3aebecdd5a38c262446f1e667c7c215b459f7c4f3da54e22" exitCode=0 Apr 16 19:54:31.381374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:31.381146 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerDied","Data":"81411fdb8487fb0f3aebecdd5a38c262446f1e667c7c215b459f7c4f3da54e22"} Apr 16 19:54:32.385564 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:32.385532 2570 generic.go:358] "Generic (PLEG): container finished" podID="405a22ed-e497-47a7-95e5-0362e26a6e43" containerID="a5bd058315d12a16713a38101d57860a78f3fe9e4cbbeb466fec2e46192e72c8" exitCode=0 Apr 16 19:54:32.385942 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:32.385612 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerDied","Data":"a5bd058315d12a16713a38101d57860a78f3fe9e4cbbeb466fec2e46192e72c8"} Apr 16 19:54:33.107358 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:33.107257 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:33.107519 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:33.107263 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:33.107519 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:33.107406 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:33.107519 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:33.107468 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:33.391114 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:33.391023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" event={"ID":"405a22ed-e497-47a7-95e5-0362e26a6e43","Type":"ContainerStarted","Data":"ef8ae1974e1d6baed68d433e42429d4b371ab28f133d0d2fff0dd7b755c832dd"} Apr 16 19:54:33.420269 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:33.420212 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ljvqz" podStartSLOduration=4.56854748 podStartE2EDuration="36.420198187s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.413336819 +0000 UTC m=+1.948202832" lastFinishedPulling="2026-04-16 19:54:30.264987523 +0000 UTC m=+33.799853539" observedRunningTime="2026-04-16 19:54:33.419625908 +0000 UTC m=+36.954491945" 
watchObservedRunningTime="2026-04-16 19:54:33.420198187 +0000 UTC m=+36.955064283" Apr 16 19:54:35.107234 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:35.107196 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:35.107634 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:35.107203 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:35.107634 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:35.107312 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:35.107634 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:35.107448 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:37.107411 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:37.107377 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:37.107787 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:37.107464 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:37.107787 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:37.107546 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:37.107787 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:37.107636 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:39.106566 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:39.106532 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:39.106566 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:39.106553 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:39.107077 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:39.106661 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:39.107077 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:39.106803 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:41.107335 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:41.107304 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:41.107766 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:41.107350 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:41.107766 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:41.107425 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:41.107766 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:41.107536 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:43.106486 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:43.106446 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:43.106956 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:43.106559 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:43.106956 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:43.106617 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:43.106956 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:43.106740 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:45.106620 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.106582 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:45.107178 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:45.106705 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rt6tz" podUID="e33890cf-1250-4378-b5a9-3ef264f23dad" Apr 16 19:54:45.107178 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.106582 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:45.107178 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:45.106904 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8x8wb" podUID="4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1" Apr 16 19:54:45.775680 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.775651 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-34.ec2.internal" event="NodeReady" Apr 16 19:54:45.775849 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.775779 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:45.859039 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.859004 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kql9s"] Apr 16 19:54:45.873671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.873637 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2drhq"] Apr 16 19:54:45.873820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.873773 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:45.876975 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.876946 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:45.877117 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.877027 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7n2jf\"" Apr 16 19:54:45.877718 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.877701 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:45.878447 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.878428 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:45.896148 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.896119 
2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kql9s"] Apr 16 19:54:45.896148 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.896146 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2drhq"] Apr 16 19:54:45.896323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.896256 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:45.900849 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.900827 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:54:45.902002 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.901983 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:54:45.902148 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.902131 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8tkc8\"" Apr 16 19:54:45.902361 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.902345 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:54:45.902447 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.902367 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:54:45.965638 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.965602 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nw756"] Apr 16 19:54:45.991951 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.991921 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nw756"] Apr 16 19:54:45.992111 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:54:45.992079 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nw756" Apr 16 19:54:45.997089 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.997048 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:45.997564 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.997541 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:45.997739 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:45.997720 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8flcv\"" Apr 16 19:54:46.007494 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/897e1deb-27fc-453c-ab4d-8c48687f74c4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.007594 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007513 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffldr\" (UniqueName: \"kubernetes.io/projected/897e1deb-27fc-453c-ab4d-8c48687f74c4-kube-api-access-ffldr\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.007594 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/897e1deb-27fc-453c-ab4d-8c48687f74c4-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.007664 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phx5j\" (UniqueName: \"kubernetes.io/projected/1d5abd6b-52ed-4fb7-997a-abcbe592b7af-kube-api-access-phx5j\") pod \"ingress-canary-kql9s\" (UID: \"1d5abd6b-52ed-4fb7-997a-abcbe592b7af\") " pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.007664 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007642 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/897e1deb-27fc-453c-ab4d-8c48687f74c4-data-volume\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.007727 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/897e1deb-27fc-453c-ab4d-8c48687f74c4-crio-socket\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.007727 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.007701 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5abd6b-52ed-4fb7-997a-abcbe592b7af-cert\") pod \"ingress-canary-kql9s\" (UID: \"1d5abd6b-52ed-4fb7-997a-abcbe592b7af\") " pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.108921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.108833 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/897e1deb-27fc-453c-ab4d-8c48687f74c4-crio-socket\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.108921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.108873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5abd6b-52ed-4fb7-997a-abcbe592b7af-cert\") pod \"ingress-canary-kql9s\" (UID: \"1d5abd6b-52ed-4fb7-997a-abcbe592b7af\") " pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.108921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.108894 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/133eb9cd-8eb0-4366-869a-c12b276770b4-tmp-dir\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.108921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.108917 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/897e1deb-27fc-453c-ab4d-8c48687f74c4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.108942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/133eb9cd-8eb0-4366-869a-c12b276770b4-config-volume\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:54:46.108981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffldr\" (UniqueName: \"kubernetes.io/projected/897e1deb-27fc-453c-ab4d-8c48687f74c4-kube-api-access-ffldr\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/897e1deb-27fc-453c-ab4d-8c48687f74c4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133eb9cd-8eb0-4366-869a-c12b276770b4-metrics-tls\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62wr\" (UniqueName: \"kubernetes.io/projected/133eb9cd-8eb0-4366-869a-c12b276770b4-kube-api-access-l62wr\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phx5j\" (UniqueName: \"kubernetes.io/projected/1d5abd6b-52ed-4fb7-997a-abcbe592b7af-kube-api-access-phx5j\") pod \"ingress-canary-kql9s\" (UID: 
\"1d5abd6b-52ed-4fb7-997a-abcbe592b7af\") " pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/897e1deb-27fc-453c-ab4d-8c48687f74c4-crio-socket\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/897e1deb-27fc-453c-ab4d-8c48687f74c4-data-volume\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.109580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109478 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/897e1deb-27fc-453c-ab4d-8c48687f74c4-data-volume\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.109917 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.109619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/897e1deb-27fc-453c-ab4d-8c48687f74c4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.113072 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.113036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/897e1deb-27fc-453c-ab4d-8c48687f74c4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.113153 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.113044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5abd6b-52ed-4fb7-997a-abcbe592b7af-cert\") pod \"ingress-canary-kql9s\" (UID: \"1d5abd6b-52ed-4fb7-997a-abcbe592b7af\") " pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.122940 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.122913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phx5j\" (UniqueName: \"kubernetes.io/projected/1d5abd6b-52ed-4fb7-997a-abcbe592b7af-kube-api-access-phx5j\") pod \"ingress-canary-kql9s\" (UID: \"1d5abd6b-52ed-4fb7-997a-abcbe592b7af\") " pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.123198 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.123178 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffldr\" (UniqueName: \"kubernetes.io/projected/897e1deb-27fc-453c-ab4d-8c48687f74c4-kube-api-access-ffldr\") pod \"insights-runtime-extractor-2drhq\" (UID: \"897e1deb-27fc-453c-ab4d-8c48687f74c4\") " pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.182667 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.182627 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kql9s" Apr 16 19:54:46.205479 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.205439 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2drhq" Apr 16 19:54:46.210328 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.210298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/133eb9cd-8eb0-4366-869a-c12b276770b4-tmp-dir\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.210477 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.210337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/133eb9cd-8eb0-4366-869a-c12b276770b4-config-volume\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.210477 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.210372 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133eb9cd-8eb0-4366-869a-c12b276770b4-metrics-tls\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.210585 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.210504 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l62wr\" (UniqueName: \"kubernetes.io/projected/133eb9cd-8eb0-4366-869a-c12b276770b4-kube-api-access-l62wr\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.210720 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.210701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/133eb9cd-8eb0-4366-869a-c12b276770b4-tmp-dir\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 
16 19:54:46.210952 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.210924 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/133eb9cd-8eb0-4366-869a-c12b276770b4-config-volume\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.212757 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.212738 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133eb9cd-8eb0-4366-869a-c12b276770b4-metrics-tls\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.221349 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.221320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62wr\" (UniqueName: \"kubernetes.io/projected/133eb9cd-8eb0-4366-869a-c12b276770b4-kube-api-access-l62wr\") pod \"dns-default-nw756\" (UID: \"133eb9cd-8eb0-4366-869a-c12b276770b4\") " pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.300452 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.300424 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nw756" Apr 16 19:54:46.353625 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.353591 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2drhq"] Apr 16 19:54:46.357915 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:46.357886 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897e1deb_27fc_453c_ab4d_8c48687f74c4.slice/crio-b6b2d7dd9b5d684ad65a7927874b57aae12231ee1e08e44b43bb5ee23dae2a02 WatchSource:0}: Error finding container b6b2d7dd9b5d684ad65a7927874b57aae12231ee1e08e44b43bb5ee23dae2a02: Status 404 returned error can't find the container with id b6b2d7dd9b5d684ad65a7927874b57aae12231ee1e08e44b43bb5ee23dae2a02 Apr 16 19:54:46.379741 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.376856 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kql9s"] Apr 16 19:54:46.418663 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.418624 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2drhq" event={"ID":"897e1deb-27fc-453c-ab4d-8c48687f74c4","Type":"ContainerStarted","Data":"b6b2d7dd9b5d684ad65a7927874b57aae12231ee1e08e44b43bb5ee23dae2a02"} Apr 16 19:54:46.419667 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.419641 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kql9s" event={"ID":"1d5abd6b-52ed-4fb7-997a-abcbe592b7af","Type":"ContainerStarted","Data":"3251b4d31e310d7489fac07b886ac029cfcf992b4f22c445849d426f0cfda2b9"} Apr 16 19:54:46.455728 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:46.455696 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nw756"] Apr 16 19:54:46.460098 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:46.460050 2570 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133eb9cd_8eb0_4366_869a_c12b276770b4.slice/crio-30a030f8b3a96fcdf93d162ce544a4a1bbb674644f52c01bd4055e5a783478a6 WatchSource:0}: Error finding container 30a030f8b3a96fcdf93d162ce544a4a1bbb674644f52c01bd4055e5a783478a6: Status 404 returned error can't find the container with id 30a030f8b3a96fcdf93d162ce544a4a1bbb674644f52c01bd4055e5a783478a6 Apr 16 19:54:47.111501 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.111468 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:54:47.112188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.111852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb" Apr 16 19:54:47.116327 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.115947 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:47.116327 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.116004 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:47.116327 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.116046 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2xst7\"" Apr 16 19:54:47.116327 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.115946 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:47.116327 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.116249 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p7kgs\"" Apr 16 19:54:47.423867 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:54:47.423765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nw756" event={"ID":"133eb9cd-8eb0-4366-869a-c12b276770b4","Type":"ContainerStarted","Data":"30a030f8b3a96fcdf93d162ce544a4a1bbb674644f52c01bd4055e5a783478a6"} Apr 16 19:54:47.425902 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.425853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2drhq" event={"ID":"897e1deb-27fc-453c-ab4d-8c48687f74c4","Type":"ContainerStarted","Data":"4aa5d6175b234e24b3755454da93428eb86c77c0b74c070be016bcfbf23987c2"} Apr 16 19:54:47.426043 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:47.425910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2drhq" event={"ID":"897e1deb-27fc-453c-ab4d-8c48687f74c4","Type":"ContainerStarted","Data":"404e0076f43c76d2762b44012525ed0a9766115aad526e1a7fcad0790f3cfa26"} Apr 16 19:54:49.270539 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.270507 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-cblsl"] Apr 16 19:54:49.273323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.273305 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.278109 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.278068 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 19:54:49.278109 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.278077 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:54:49.278301 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.278113 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:54:49.278301 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.278113 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nxkn6\"" Apr 16 19:54:49.278301 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.278077 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:54:49.278301 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.278185 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 19:54:49.292347 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.292318 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-cblsl"] Apr 16 19:54:49.432829 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.432792 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2drhq" event={"ID":"897e1deb-27fc-453c-ab4d-8c48687f74c4","Type":"ContainerStarted","Data":"218a4ec01949db6d945ab502a85e24a4b636a2591afeac5309a1b38bb9df185e"} Apr 16 19:54:49.434167 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:54:49.434136 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kql9s" event={"ID":"1d5abd6b-52ed-4fb7-997a-abcbe592b7af","Type":"ContainerStarted","Data":"8fd2e0e55945f2fce8dbdddca6d8f8e5c97562dcda1a1d92666637765f650971"} Apr 16 19:54:49.435730 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.435705 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nw756" event={"ID":"133eb9cd-8eb0-4366-869a-c12b276770b4","Type":"ContainerStarted","Data":"d37b27a38ca18b2dce6ab4213f7f70c89a6caf17877d9843e595c453cd98e7ba"} Apr 16 19:54:49.435730 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.435732 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nw756" event={"ID":"133eb9cd-8eb0-4366-869a-c12b276770b4","Type":"ContainerStarted","Data":"ca6f8e83264a06fc7d925e097bc4471f6e22072f6cead2e19172c82389a65f94"} Apr 16 19:54:49.435929 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.435827 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nw756" Apr 16 19:54:49.437297 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.437276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a2f4e5e-93ed-4a68-975d-93771007f8ff-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.437396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.437334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a2f4e5e-93ed-4a68-975d-93771007f8ff-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.437449 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.437405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f4e5e-93ed-4a68-975d-93771007f8ff-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.437449 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.437431 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86b9\" (UniqueName: \"kubernetes.io/projected/6a2f4e5e-93ed-4a68-975d-93771007f8ff-kube-api-access-k86b9\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.455781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.455707 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2drhq" podStartSLOduration=2.121862833 podStartE2EDuration="4.455691408s" podCreationTimestamp="2026-04-16 19:54:45 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.546355058 +0000 UTC m=+50.081221072" lastFinishedPulling="2026-04-16 19:54:48.880183623 +0000 UTC m=+52.415049647" observedRunningTime="2026-04-16 19:54:49.455322001 +0000 UTC m=+52.990188039" watchObservedRunningTime="2026-04-16 19:54:49.455691408 +0000 UTC m=+52.990557445" Apr 16 19:54:49.476192 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.476137 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nw756" podStartSLOduration=2.111041502 podStartE2EDuration="4.476117914s" 
podCreationTimestamp="2026-04-16 19:54:45 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.461873195 +0000 UTC m=+49.996739209" lastFinishedPulling="2026-04-16 19:54:48.826949602 +0000 UTC m=+52.361815621" observedRunningTime="2026-04-16 19:54:49.475686957 +0000 UTC m=+53.010552991" watchObservedRunningTime="2026-04-16 19:54:49.476117914 +0000 UTC m=+53.010983951" Apr 16 19:54:49.496957 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.496909 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kql9s" podStartSLOduration=2.06981603 podStartE2EDuration="4.496894405s" podCreationTimestamp="2026-04-16 19:54:45 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.399953902 +0000 UTC m=+49.934819929" lastFinishedPulling="2026-04-16 19:54:48.827032286 +0000 UTC m=+52.361898304" observedRunningTime="2026-04-16 19:54:49.495666496 +0000 UTC m=+53.030532539" watchObservedRunningTime="2026-04-16 19:54:49.496894405 +0000 UTC m=+53.031760441" Apr 16 19:54:49.538881 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.538780 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a2f4e5e-93ed-4a68-975d-93771007f8ff-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.538881 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.538858 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f4e5e-93ed-4a68-975d-93771007f8ff-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.539704 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.539467 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k86b9\" (UniqueName: \"kubernetes.io/projected/6a2f4e5e-93ed-4a68-975d-93771007f8ff-kube-api-access-k86b9\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.539704 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.539591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a2f4e5e-93ed-4a68-975d-93771007f8ff-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.539704 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.539598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f4e5e-93ed-4a68-975d-93771007f8ff-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.541870 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.541838 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a2f4e5e-93ed-4a68-975d-93771007f8ff-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.542067 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.542037 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6a2f4e5e-93ed-4a68-975d-93771007f8ff-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.549891 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.549866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86b9\" (UniqueName: \"kubernetes.io/projected/6a2f4e5e-93ed-4a68-975d-93771007f8ff-kube-api-access-k86b9\") pod \"prometheus-operator-5676c8c784-cblsl\" (UID: \"6a2f4e5e-93ed-4a68-975d-93771007f8ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.582396 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.582361 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" Apr 16 19:54:49.710790 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:49.710759 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-cblsl"] Apr 16 19:54:49.713515 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:49.713487 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a2f4e5e_93ed_4a68_975d_93771007f8ff.slice/crio-94352043c93f0f693e1b324953da165fed45e7b58af57d26ca56e5ffeee3bb8b WatchSource:0}: Error finding container 94352043c93f0f693e1b324953da165fed45e7b58af57d26ca56e5ffeee3bb8b: Status 404 returned error can't find the container with id 94352043c93f0f693e1b324953da165fed45e7b58af57d26ca56e5ffeee3bb8b Apr 16 19:54:50.440106 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.440033 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" 
event={"ID":"6a2f4e5e-93ed-4a68-975d-93771007f8ff","Type":"ContainerStarted","Data":"94352043c93f0f693e1b324953da165fed45e7b58af57d26ca56e5ffeee3bb8b"} Apr 16 19:54:50.691873 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.691780 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6984c5ccbb-99m8p"] Apr 16 19:54:50.695048 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.695015 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.697814 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.697788 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699152 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699181 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qnsx7\"" Apr 16 19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699235 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699279 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699215 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 
19:54:50.699420 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.699162 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:54:50.709667 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.709634 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6984c5ccbb-99m8p"] Apr 16 19:54:50.850133 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.850099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-oauth-config\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.850298 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.850142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-console-config\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.850298 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.850176 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-oauth-serving-cert\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.850418 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.850309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-service-ca\") pod 
\"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.850418 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.850366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7m4c\" (UniqueName: \"kubernetes.io/projected/18958238-c99b-4a8f-94f1-659781539d41-kube-api-access-j7m4c\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.850418 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.850402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-serving-cert\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.951818 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.951716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7m4c\" (UniqueName: \"kubernetes.io/projected/18958238-c99b-4a8f-94f1-659781539d41-kube-api-access-j7m4c\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.951818 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.951787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-serving-cert\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.951818 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.951825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-oauth-config\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.952073 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.951864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-console-config\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.952073 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.951898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-oauth-serving-cert\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.952073 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.951947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-service-ca\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.953220 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.953191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-oauth-serving-cert\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.953378 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.953357 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-service-ca\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.953417 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.953357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-console-config\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.954231 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.954212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-oauth-config\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.954304 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.954288 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-serving-cert\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:50.961895 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:50.961874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7m4c\" (UniqueName: \"kubernetes.io/projected/18958238-c99b-4a8f-94f1-659781539d41-kube-api-access-j7m4c\") pod \"console-6984c5ccbb-99m8p\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:51.006800 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:54:51.006750 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:54:51.151143 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:51.151105 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6984c5ccbb-99m8p"] Apr 16 19:54:51.412752 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:51.412721 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18958238_c99b_4a8f_94f1_659781539d41.slice/crio-b30fe855afbf0de108bac8794508bbdda134d0f59fa3f3bfa11249045caaa398 WatchSource:0}: Error finding container b30fe855afbf0de108bac8794508bbdda134d0f59fa3f3bfa11249045caaa398: Status 404 returned error can't find the container with id b30fe855afbf0de108bac8794508bbdda134d0f59fa3f3bfa11249045caaa398 Apr 16 19:54:51.443096 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:51.443042 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6984c5ccbb-99m8p" event={"ID":"18958238-c99b-4a8f-94f1-659781539d41","Type":"ContainerStarted","Data":"b30fe855afbf0de108bac8794508bbdda134d0f59fa3f3bfa11249045caaa398"} Apr 16 19:54:52.447754 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:52.447701 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" event={"ID":"6a2f4e5e-93ed-4a68-975d-93771007f8ff","Type":"ContainerStarted","Data":"b64fe20c27c3e3322c849e92cecfc10edc0955b6ba6ba74257665f4647f63920"} Apr 16 19:54:52.447754 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:52.447746 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" event={"ID":"6a2f4e5e-93ed-4a68-975d-93771007f8ff","Type":"ContainerStarted","Data":"bc862b7b71da672dea5e2731659267feb5e8d5c2ab6114231ce2742b5779eace"} Apr 16 19:54:54.375965 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:54:54.375934 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5tmb" Apr 16 19:54:54.419044 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.418994 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-cblsl" podStartSLOduration=3.678351809 podStartE2EDuration="5.418973161s" podCreationTimestamp="2026-04-16 19:54:49 +0000 UTC" firstStartedPulling="2026-04-16 19:54:49.715314624 +0000 UTC m=+53.250180638" lastFinishedPulling="2026-04-16 19:54:51.455935971 +0000 UTC m=+54.990801990" observedRunningTime="2026-04-16 19:54:52.473317144 +0000 UTC m=+56.008183181" watchObservedRunningTime="2026-04-16 19:54:54.418973161 +0000 UTC m=+57.953839199" Apr 16 19:54:54.695374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.695288 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"] Apr 16 19:54:54.698481 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.698464 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" Apr 16 19:54:54.701488 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.701464 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:54:54.701742 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.701725 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-dqqjn\"" Apr 16 19:54:54.703333 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.703318 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 19:54:54.710973 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.710950 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nmzrv"] Apr 16 19:54:54.713913 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.713897 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.719969 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.719935 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"] Apr 16 19:54:54.720519 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.720502 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:54:54.720866 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.720839 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:54:54.720953 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.720865 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:54:54.721370 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.721356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rv4cw\"" Apr 16 19:54:54.748228 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.748196 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"] Apr 16 19:54:54.752364 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.752337 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.760484 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.760462 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:54:54.760595 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.760541 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 19:54:54.760851 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.760837 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qt584\"" Apr 16 19:54:54.761249 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.761235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 19:54:54.773246 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.773218 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"] Apr 16 19:54:54.777107 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.777201 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.777201 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777144 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmth\" (UniqueName: \"kubernetes.io/projected/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-api-access-sdmth\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.777201 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-wtmp\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777311 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777211 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" Apr 16 19:54:54.777311 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777262 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777311 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.777401 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777310 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-sys\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777401 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-textfile\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777401 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777484 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777418 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" Apr 16 19:54:54.777484 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777444 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2fc1956-9cb0-45a4-8b02-f764c61c9655-metrics-client-ca\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777550 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777481 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93af7b8d-bce4-432e-885a-c48d5e8895fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.777550 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-root\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777628 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-tls\") pod \"node-exporter-nmzrv\" (UID: 
\"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777628 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93af7b8d-bce4-432e-885a-c48d5e8895fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 19:54:54.777628 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gfk7\" (UniqueName: \"kubernetes.io/projected/a2fc1956-9cb0-45a4-8b02-f764c61c9655-kube-api-access-5gfk7\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv" Apr 16 19:54:54.777723 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" Apr 16 19:54:54.777723 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.777651 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wnc\" (UniqueName: \"kubernetes.io/projected/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-kube-api-access-n7wnc\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" Apr 16 19:54:54.878170 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:54.878170 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wnc\" (UniqueName: \"kubernetes.io/projected/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-kube-api-access-n7wnc\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:54.878427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.878427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.878427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmth\" (UniqueName: \"kubernetes.io/projected/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-api-access-sdmth\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.878427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-wtmp\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.878427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:54.878661 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.878661 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-wtmp\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.878817 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.878986 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-sys\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.878986 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.878960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-sys\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879151 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879040 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-textfile\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879151 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879151 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:54.879151 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879359 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2fc1956-9cb0-45a4-8b02-f764c61c9655-metrics-client-ca\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879359 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:54.879359 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879323 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-textfile\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879359 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:54.879331 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 19:54:54.879580 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:54.879389 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-tls podName:5193f0a6-8fbd-4c14-b92c-1eae57fc248b nodeName:}" failed. No retries permitted until 2026-04-16 19:54:55.379371076 +0000 UTC m=+58.914237093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-f47kp" (UID: "5193f0a6-8fbd-4c14-b92c-1eae57fc248b") : secret "openshift-state-metrics-tls" not found
Apr 16 19:54:54.879580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879448 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93af7b8d-bce4-432e-885a-c48d5e8895fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.879580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-root\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-tls\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879580 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93af7b8d-bce4-432e-885a-c48d5e8895fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" Apr 16 
19:54:54.879832 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879620 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a2fc1956-9cb0-45a4-8b02-f764c61c9655-root\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879832 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2fc1956-9cb0-45a4-8b02-f764c61c9655-metrics-client-ca\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.879832 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.879659 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gfk7\" (UniqueName: \"kubernetes.io/projected/a2fc1956-9cb0-45a4-8b02-f764c61c9655-kube-api-access-5gfk7\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.880115 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.880093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93af7b8d-bce4-432e-885a-c48d5e8895fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.880277 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.880157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93af7b8d-bce4-432e-885a-c48d5e8895fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.880277 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.880188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.881299 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.881276 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:54.881384 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.881279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.881384 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.881324 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.881491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.881474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.882369 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.882354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a2fc1956-9cb0-45a4-8b02-f764c61c9655-node-exporter-tls\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.894416 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.894386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gfk7\" (UniqueName: \"kubernetes.io/projected/a2fc1956-9cb0-45a4-8b02-f764c61c9655-kube-api-access-5gfk7\") pod \"node-exporter-nmzrv\" (UID: \"a2fc1956-9cb0-45a4-8b02-f764c61c9655\") " pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:54.894651 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.894632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmth\" (UniqueName: \"kubernetes.io/projected/93af7b8d-bce4-432e-885a-c48d5e8895fa-kube-api-access-sdmth\") pod \"kube-state-metrics-69db897b98-wx7xc\" (UID: \"93af7b8d-bce4-432e-885a-c48d5e8895fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:54.894928 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:54.894907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7wnc\" (UniqueName: \"kubernetes.io/projected/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-kube-api-access-n7wnc\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: 
\"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:55.022861 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.022826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nmzrv"
Apr 16 19:54:55.031342 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:55.031314 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2fc1956_9cb0_45a4_8b02_f764c61c9655.slice/crio-64468348ee91f794a6696117e9ac94736e7f3d41e5ab668953bdcf9c2eb81332 WatchSource:0}: Error finding container 64468348ee91f794a6696117e9ac94736e7f3d41e5ab668953bdcf9c2eb81332: Status 404 returned error can't find the container with id 64468348ee91f794a6696117e9ac94736e7f3d41e5ab668953bdcf9c2eb81332
Apr 16 19:54:55.061109 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.061048 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"
Apr 16 19:54:55.186995 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.186963 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wx7xc"]
Apr 16 19:54:55.190811 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:55.190783 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93af7b8d_bce4_432e_885a_c48d5e8895fa.slice/crio-2be3fc0109b5e8e1cbb9b25eed37bed0532153c70b4cf62c8abce1207f70084c WatchSource:0}: Error finding container 2be3fc0109b5e8e1cbb9b25eed37bed0532153c70b4cf62c8abce1207f70084c: Status 404 returned error can't find the container with id 2be3fc0109b5e8e1cbb9b25eed37bed0532153c70b4cf62c8abce1207f70084c
Apr 16 19:54:55.384645 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.384564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:55.386908 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.386886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5193f0a6-8fbd-4c14-b92c-1eae57fc248b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-f47kp\" (UID: \"5193f0a6-8fbd-4c14-b92c-1eae57fc248b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:55.459890 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.459850 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" event={"ID":"93af7b8d-bce4-432e-885a-c48d5e8895fa","Type":"ContainerStarted","Data":"2be3fc0109b5e8e1cbb9b25eed37bed0532153c70b4cf62c8abce1207f70084c"}
Apr 16 19:54:55.460811 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.460781 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmzrv" event={"ID":"a2fc1956-9cb0-45a4-8b02-f764c61c9655","Type":"ContainerStarted","Data":"64468348ee91f794a6696117e9ac94736e7f3d41e5ab668953bdcf9c2eb81332"}
Apr 16 19:54:55.462024 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.462000 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6984c5ccbb-99m8p" event={"ID":"18958238-c99b-4a8f-94f1-659781539d41","Type":"ContainerStarted","Data":"6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89"}
Apr 16 19:54:55.479153 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.479103 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6984c5ccbb-99m8p" podStartSLOduration=2.473273812 podStartE2EDuration="5.479089323s" podCreationTimestamp="2026-04-16 19:54:50 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.414643813 +0000 UTC m=+54.949509830" lastFinishedPulling="2026-04-16 19:54:54.420459327 +0000 UTC m=+57.955325341" observedRunningTime="2026-04-16 19:54:55.47840872 +0000 UTC m=+59.013274792" watchObservedRunningTime="2026-04-16 19:54:55.479089323 +0000 UTC m=+59.013955359"
Apr 16 19:54:55.607175 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.607142 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"
Apr 16 19:54:55.741588 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.741554 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:54:55.746086 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.746049 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.749402 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.749347 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 19:54:55.749919 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.749743 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 19:54:55.749919 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.749781 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 19:54:55.749919 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.749850 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 19:54:55.750124 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.750092 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 19:54:55.750124 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.750099 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8qhhm\""
Apr 16 19:54:55.750189 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.750142 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 19:54:55.750219 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.750098 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 19:54:55.750353 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.750336 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 19:54:55.750427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.750411 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 19:54:55.753509 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.753455 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp"]
Apr 16 19:54:55.776487 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.776091 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:54:55.788140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ccx\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-kube-api-access-22ccx\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788149 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788253 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788281 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788610 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-out\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788610 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788419 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-volume\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788610 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788446 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788610 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788610 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-web-config\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.788610 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.788575 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.863740 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:55.863699 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5193f0a6_8fbd_4c14_b92c_1eae57fc248b.slice/crio-05adfd4a65c4a4483bc84a211812c5fee2bd2a84609006825491f7a0aaf0695c WatchSource:0}: Error finding container 05adfd4a65c4a4483bc84a211812c5fee2bd2a84609006825491f7a0aaf0695c: Status 404 returned error can't find the container with id 05adfd4a65c4a4483bc84a211812c5fee2bd2a84609006825491f7a0aaf0695c
Apr 16 19:54:55.889893 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.889862 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890049 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.889932 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-out\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890049 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.889973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-volume\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890049 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.889996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890049 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890024 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890049 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890071 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-web-config\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22ccx\" (UniqueName: 
\"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-kube-api-access-22ccx\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890194 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:55.890204 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890227 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890252 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:55.890282 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls podName:bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:56.390257243 +0000 UTC m=+59.925123277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6") : secret "alertmanager-main-tls" not found
Apr 16 19:54:55.890339 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890591 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.890591 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:54:55.890395 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle podName:bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:56.390383546 +0000 UTC m=+59.925249569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6") : configmap references non-existent config key: ca-bundle.crt
Apr 16 19:54:55.890929 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.891009 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.890977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.894279 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.894151 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-out\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.894279 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.894161 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:54:55.894279 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:54:55.894214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:55.894279 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.894215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:55.894601 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.894578 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:55.894956 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.894932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-web-config\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:55.895154 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.895136 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-volume\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:55.895490 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.895457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:55.899528 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:55.899502 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ccx\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-kube-api-access-22ccx\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:56.393435 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.393393 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:56.393942 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.393493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:56.394407 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.394379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:56.396226 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.396203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:56.468426 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.468382 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" event={"ID":"5193f0a6-8fbd-4c14-b92c-1eae57fc248b","Type":"ContainerStarted","Data":"e0b878f256a23144aa269044ab3eb3c03ae5fee6333c2d854de360a9d4aaff24"} Apr 16 19:54:56.468599 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.468431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" event={"ID":"5193f0a6-8fbd-4c14-b92c-1eae57fc248b","Type":"ContainerStarted","Data":"8531aee78e263b7908d11a2e72b6f1f011267630caee6accc9f4f07840fa0097"} Apr 16 19:54:56.468599 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.468449 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" event={"ID":"5193f0a6-8fbd-4c14-b92c-1eae57fc248b","Type":"ContainerStarted","Data":"05adfd4a65c4a4483bc84a211812c5fee2bd2a84609006825491f7a0aaf0695c"} Apr 16 19:54:56.470016 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.469989 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2fc1956-9cb0-45a4-8b02-f764c61c9655" containerID="90ba1cd0542da13dc2d5b66e44864388e1b796da8664425684cfd5b7aed868e7" exitCode=0 Apr 16 19:54:56.470142 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.470090 2570 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/node-exporter-nmzrv" event={"ID":"a2fc1956-9cb0-45a4-8b02-f764c61c9655","Type":"ContainerDied","Data":"90ba1cd0542da13dc2d5b66e44864388e1b796da8664425684cfd5b7aed868e7"} Apr 16 19:54:56.657863 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.657778 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:54:56.823091 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:56.822950 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:54:56.826677 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:56.826648 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf9b856_3ec2_4edd_beaa_0aebf8bb41f6.slice/crio-f23350fd934034fe054e43694b893dd641bc60aadf38783626814e1b3e099406 WatchSource:0}: Error finding container f23350fd934034fe054e43694b893dd641bc60aadf38783626814e1b3e099406: Status 404 returned error can't find the container with id f23350fd934034fe054e43694b893dd641bc60aadf38783626814e1b3e099406 Apr 16 19:54:57.474368 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.474334 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"f23350fd934034fe054e43694b893dd641bc60aadf38783626814e1b3e099406"} Apr 16 19:54:57.476660 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.476584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" event={"ID":"93af7b8d-bce4-432e-885a-c48d5e8895fa","Type":"ContainerStarted","Data":"dc8294fcac16cb98a74723b75b547b1a1cee45594b992e48880a2194d1ef7eb4"} Apr 16 19:54:57.476660 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.476622 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" event={"ID":"93af7b8d-bce4-432e-885a-c48d5e8895fa","Type":"ContainerStarted","Data":"741869222ba46202c18fa251f508f714ad92eeade861e310ef359f6ac5b831ae"} Apr 16 19:54:57.476660 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.476636 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" event={"ID":"93af7b8d-bce4-432e-885a-c48d5e8895fa","Type":"ContainerStarted","Data":"a96dfe1ef7ef1ef1bb03d5bd84950ec01e8b6eb7415bbfae8332ff5064fab8e1"} Apr 16 19:54:57.478739 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.478717 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmzrv" event={"ID":"a2fc1956-9cb0-45a4-8b02-f764c61c9655","Type":"ContainerStarted","Data":"2405d927976d80fc54058f718644d0c70023221c018a95945c164978153f066b"} Apr 16 19:54:57.478849 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.478744 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmzrv" event={"ID":"a2fc1956-9cb0-45a4-8b02-f764c61c9655","Type":"ContainerStarted","Data":"acd09c4c13f0f8da1db7fb77b596fb4a9149bea133f4a1b2762fd86052d0576d"} Apr 16 19:54:57.516456 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.516397 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nmzrv" podStartSLOduration=2.637353469 podStartE2EDuration="3.516378499s" podCreationTimestamp="2026-04-16 19:54:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:55.033008307 +0000 UTC m=+58.567874321" lastFinishedPulling="2026-04-16 19:54:55.912033333 +0000 UTC m=+59.446899351" observedRunningTime="2026-04-16 19:54:57.516078602 +0000 UTC m=+61.050944638" watchObservedRunningTime="2026-04-16 19:54:57.516378499 +0000 UTC m=+61.051244537" Apr 16 19:54:57.516863 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.516832 2570 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-wx7xc" podStartSLOduration=2.151921822 podStartE2EDuration="3.516823324s" podCreationTimestamp="2026-04-16 19:54:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:55.192786983 +0000 UTC m=+58.727652997" lastFinishedPulling="2026-04-16 19:54:56.557688484 +0000 UTC m=+60.092554499" observedRunningTime="2026-04-16 19:54:57.494602814 +0000 UTC m=+61.029468850" watchObservedRunningTime="2026-04-16 19:54:57.516823324 +0000 UTC m=+61.051689362" Apr 16 19:54:57.656272 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.656240 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6896fd559d-s8tfj"] Apr 16 19:54:57.659671 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.659653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.662720 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.662696 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 19:54:57.662831 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.662778 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 19:54:57.662921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.662905 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 19:54:57.662986 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.662928 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 19:54:57.663041 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.663012 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 19:54:57.663157 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.663116 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1uhaqhnhn0ntt\"" Apr 16 19:54:57.663157 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.663125 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-sh5b8\"" Apr 16 19:54:57.673286 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.673264 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6896fd559d-s8tfj"] Apr 16 19:54:57.708229 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708229 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708236 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708346 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-grpc-tls\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708412 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4848w\" (UniqueName: \"kubernetes.io/projected/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-kube-api-access-4848w\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708602 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-tls\") pod 
\"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.708602 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.708484 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-metrics-client-ca\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809362 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-grpc-tls\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809362 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809626 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809409 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4848w\" (UniqueName: \"kubernetes.io/projected/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-kube-api-access-4848w\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809626 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:54:57.809439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-tls\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809626 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-metrics-client-ca\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809626 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809528 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809626 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.809626 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.809589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.810338 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.810314 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-metrics-client-ca\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.812738 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.812649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.812738 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.812660 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-tls\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.812738 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.812732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " 
pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.812981 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.812959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.813555 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.813520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-grpc-tls\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.814446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.814420 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.820313 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.820285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4848w\" (UniqueName: \"kubernetes.io/projected/fa467cab-b8b0-4afb-9a81-a5ed7c53cd97-kube-api-access-4848w\") pod \"thanos-querier-6896fd559d-s8tfj\" (UID: \"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97\") " pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:57.969142 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:57.969103 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:54:58.123795 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.123772 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6896fd559d-s8tfj"] Apr 16 19:54:58.126430 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:58.126393 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa467cab_b8b0_4afb_9a81_a5ed7c53cd97.slice/crio-2199d6f03befef4a9cf3df05cfb614823f2511fe4a1f93fb3e4422c777041e81 WatchSource:0}: Error finding container 2199d6f03befef4a9cf3df05cfb614823f2511fe4a1f93fb3e4422c777041e81: Status 404 returned error can't find the container with id 2199d6f03befef4a9cf3df05cfb614823f2511fe4a1f93fb3e4422c777041e81 Apr 16 19:54:58.251141 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.251103 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77ddbf6cbb-9s7fq"] Apr 16 19:54:58.254724 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.254704 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.261716 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.261688 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 19:54:58.266303 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.266277 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77ddbf6cbb-9s7fq"]
Apr 16 19:54:58.313763 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313720 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n76zz\" (UniqueName: \"kubernetes.io/projected/3362b23b-452e-46f5-9fd6-86ecb0a197bf-kube-api-access-n76zz\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.313930 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-config\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.313930 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313802 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-oauth-serving-cert\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.313930 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-service-ca\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.313930 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-trusted-ca-bundle\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.313930 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313917 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-oauth-config\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.314110 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.313963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-serving-cert\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414407 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-oauth-config\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414407 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414355 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-serving-cert\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414407 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n76zz\" (UniqueName: \"kubernetes.io/projected/3362b23b-452e-46f5-9fd6-86ecb0a197bf-kube-api-access-n76zz\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414685 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-config\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414685 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414449 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-oauth-serving-cert\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414685 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-service-ca\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.414685 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.414551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-trusted-ca-bundle\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.415351 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.415322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-oauth-serving-cert\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.415463 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.415332 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-service-ca\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.415463 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.415409 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-trusted-ca-bundle\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.415463 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.415431 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-config\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.416908 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.416887 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-serving-cert\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.417020 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.416911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-oauth-config\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.423111 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.423093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n76zz\" (UniqueName: \"kubernetes.io/projected/3362b23b-452e-46f5-9fd6-86ecb0a197bf-kube-api-access-n76zz\") pod \"console-77ddbf6cbb-9s7fq\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.482212 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.482176 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"2199d6f03befef4a9cf3df05cfb614823f2511fe4a1f93fb3e4422c777041e81"}
Apr 16 19:54:58.483455 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.483432 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99" exitCode=0
Apr 16 19:54:58.483573 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.483523 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"}
Apr 16 19:54:58.485511 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.485486 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" event={"ID":"5193f0a6-8fbd-4c14-b92c-1eae57fc248b","Type":"ContainerStarted","Data":"4dc5fe3746f4b8ffddacdd817257a911997b7ba255f6ac60ede7b6119442d996"}
Apr 16 19:54:58.530643 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.530588 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-f47kp" podStartSLOduration=3.051537177 podStartE2EDuration="4.53057323s" podCreationTimestamp="2026-04-16 19:54:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:55.997190138 +0000 UTC m=+59.532056153" lastFinishedPulling="2026-04-16 19:54:57.476226178 +0000 UTC m=+61.011092206" observedRunningTime="2026-04-16 19:54:58.528898698 +0000 UTC m=+62.063764724" watchObservedRunningTime="2026-04-16 19:54:58.53057323 +0000 UTC m=+62.065439295"
Apr 16 19:54:58.563384 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.563349 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77ddbf6cbb-9s7fq"
Apr 16 19:54:58.682723 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:58.682649 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77ddbf6cbb-9s7fq"]
Apr 16 19:54:58.685771 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:54:58.685739 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3362b23b_452e_46f5_9fd6_86ecb0a197bf.slice/crio-8620abb2899af328170c23f3998075b4cccead5a35037929663831e62ba68154 WatchSource:0}: Error finding container 8620abb2899af328170c23f3998075b4cccead5a35037929663831e62ba68154: Status 404 returned error can't find the container with id 8620abb2899af328170c23f3998075b4cccead5a35037929663831e62ba68154
Apr 16 19:54:59.239830 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.239792 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57c6995b97-7nqjd"]
Apr 16 19:54:59.243355 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.243330 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.246544 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.246516 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5eh1ldst9iju3\""
Apr 16 19:54:59.248374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.247781 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5nmtw\""
Apr 16 19:54:59.248374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.247805 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 19:54:59.248374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.247913 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 19:54:59.248374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.247944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 19:54:59.248374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.248076 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:54:59.254178 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.253908 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57c6995b97-7nqjd"]
Apr 16 19:54:59.323462 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323418 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-secret-metrics-server-tls\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.323631 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-metrics-server-audit-profiles\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.323631 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323519 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.323631 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-secret-metrics-server-client-certs\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.323790 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmx7j\" (UniqueName: \"kubernetes.io/projected/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-kube-api-access-kmx7j\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.323790 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323706 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-audit-log\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.323790 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.323773 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-client-ca-bundle\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.413448 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.413414 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"]
Apr 16 19:54:59.416497 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.416475 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"
Apr 16 19:54:59.419235 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.419199 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 19:54:59.419235 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.419255 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-qfh44\""
Apr 16 19:54:59.424556 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.424513 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"]
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-audit-log\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-client-ca-bundle\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425431 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-secret-metrics-server-tls\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-metrics-server-audit-profiles\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-secret-metrics-server-client-certs\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.425601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmx7j\" (UniqueName: \"kubernetes.io/projected/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-kube-api-access-kmx7j\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.426206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-audit-log\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.426881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-metrics-server-audit-profiles\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.427781 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.427476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.428896 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.428868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-client-ca-bundle\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.429372 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.429347 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-secret-metrics-server-tls\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.431240 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.431217 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-secret-metrics-server-client-certs\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.441115 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.441040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmx7j\" (UniqueName: \"kubernetes.io/projected/4099f14d-0af4-4fc9-ad94-aa5abbfc67ce-kube-api-access-kmx7j\") pod \"metrics-server-57c6995b97-7nqjd\" (UID: \"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce\") " pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.443269 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.443245 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nw756"
Apr 16 19:54:59.490808 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.490717 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77ddbf6cbb-9s7fq" event={"ID":"3362b23b-452e-46f5-9fd6-86ecb0a197bf","Type":"ContainerStarted","Data":"3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae"}
Apr 16 19:54:59.490808 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.490773 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77ddbf6cbb-9s7fq" event={"ID":"3362b23b-452e-46f5-9fd6-86ecb0a197bf","Type":"ContainerStarted","Data":"8620abb2899af328170c23f3998075b4cccead5a35037929663831e62ba68154"}
Apr 16 19:54:59.508276 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.507830 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77ddbf6cbb-9s7fq" podStartSLOduration=1.507808507 podStartE2EDuration="1.507808507s" podCreationTimestamp="2026-04-16 19:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:59.507689479 +0000 UTC m=+63.042555516" watchObservedRunningTime="2026-04-16 19:54:59.507808507 +0000 UTC m=+63.042674543"
Apr 16 19:54:59.526864 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.526831 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8995f8ed-4eab-4bb2-a5a7-2df79fd493f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65ndw\" (UID: \"8995f8ed-4eab-4bb2-a5a7-2df79fd493f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"
Apr 16 19:54:59.556886 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.556840 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd"
Apr 16 19:54:59.628044 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.628009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8995f8ed-4eab-4bb2-a5a7-2df79fd493f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65ndw\" (UID: \"8995f8ed-4eab-4bb2-a5a7-2df79fd493f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"
Apr 16 19:54:59.631140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.631101 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8995f8ed-4eab-4bb2-a5a7-2df79fd493f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65ndw\" (UID: \"8995f8ed-4eab-4bb2-a5a7-2df79fd493f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"
Apr 16 19:54:59.729564 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:54:59.729128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"
Apr 16 19:55:00.115808 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:00.115775 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57c6995b97-7nqjd"]
Apr 16 19:55:00.138547 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:00.138515 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"]
Apr 16 19:55:00.816040 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:55:00.815999 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4099f14d_0af4_4fc9_ad94_aa5abbfc67ce.slice/crio-3d5386e12d49fc25ff79bedb72680bf510b9ae13314f54b51aae9b3b03e17554 WatchSource:0}: Error finding container 3d5386e12d49fc25ff79bedb72680bf510b9ae13314f54b51aae9b3b03e17554: Status 404 returned error can't find the container with id 3d5386e12d49fc25ff79bedb72680bf510b9ae13314f54b51aae9b3b03e17554
Apr 16 19:55:00.817132 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:55:00.816951 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8995f8ed_4eab_4bb2_a5a7_2df79fd493f4.slice/crio-5131ded588410e991bf43e32368cbd3d3addaad6f8af8e05e8f320ba5f019153 WatchSource:0}: Error finding container 5131ded588410e991bf43e32368cbd3d3addaad6f8af8e05e8f320ba5f019153: Status 404 returned error can't find the container with id 5131ded588410e991bf43e32368cbd3d3addaad6f8af8e05e8f320ba5f019153
Apr 16 19:55:01.007186 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.007164 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6984c5ccbb-99m8p"
Apr 16 19:55:01.007300 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.007200 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6984c5ccbb-99m8p"
Apr 16 19:55:01.011821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.011802 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6984c5ccbb-99m8p"
Apr 16 19:55:01.018357 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.018330 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6984c5ccbb-99m8p"
Apr 16 19:55:01.501892 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.501858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"1a89a3ff74795af13eca9e55eff5e110ebe39ac7ff6ce3e27b7222ea958cec2d"}
Apr 16 19:55:01.502104 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.501896 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"bfddd039ddf3c7cb2659de6363966d316a07aadf75e8a941501ddb08e47e9436"}
Apr 16 19:55:01.502104 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.501916 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"1caac5dc43691b9408b26aca1429d72baa0540566f040e561629fbf9cedc15d5"}
Apr 16 19:55:01.504821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.504717 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"}
Apr 16 19:55:01.504821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.504752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"}
Apr 16 19:55:01.504821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.504766 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"}
Apr 16 19:55:01.504821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.504777 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"}
Apr 16 19:55:01.506005 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.505946 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw" event={"ID":"8995f8ed-4eab-4bb2-a5a7-2df79fd493f4","Type":"ContainerStarted","Data":"5131ded588410e991bf43e32368cbd3d3addaad6f8af8e05e8f320ba5f019153"}
Apr 16 19:55:01.507280 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.507226 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" event={"ID":"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce","Type":"ContainerStarted","Data":"3d5386e12d49fc25ff79bedb72680bf510b9ae13314f54b51aae9b3b03e17554"}
Apr 16 19:55:01.853321 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.853287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb"
Apr 16 19:55:01.853756 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.853383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz"
Apr 16 19:55:01.856409 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.856378 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:55:01.856577 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.856406 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:55:01.866604 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.866434 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:55:01.866727 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.866664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1-metrics-certs\") pod \"network-metrics-daemon-8x8wb\" (UID: \"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1\") " pod="openshift-multus/network-metrics-daemon-8x8wb"
Apr 16 19:55:01.877385 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:01.877349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l96v5\" (UniqueName: \"kubernetes.io/projected/e33890cf-1250-4378-b5a9-3ef264f23dad-kube-api-access-l96v5\") pod \"network-check-target-rt6tz\" (UID: \"e33890cf-1250-4378-b5a9-3ef264f23dad\") " pod="openshift-network-diagnostics/network-check-target-rt6tz"
Apr 16 19:55:02.145791 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:02.145705 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p7kgs\""
Apr 16 19:55:02.153514 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:02.153476 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rt6tz"
Apr 16 19:55:02.154325 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:02.154092 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2xst7\""
Apr 16 19:55:02.162096 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:02.162046 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8x8wb"
Apr 16 19:55:02.513943 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:02.513910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"}
Apr 16 19:55:03.232413 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.232382 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8x8wb"]
Apr 16 19:55:03.235924 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:55:03.235889 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a9e66bb_af9f_4bcc_9d1f_a9b851f41ba1.slice/crio-10764d9159a5e236b9d76229d5f445acc3d81108e7291ca9d494280061d23d6f WatchSource:0}: Error finding container 10764d9159a5e236b9d76229d5f445acc3d81108e7291ca9d494280061d23d6f: Status 404 returned error can't find the container with id 10764d9159a5e236b9d76229d5f445acc3d81108e7291ca9d494280061d23d6f
Apr 16 19:55:03.245683 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.244728 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rt6tz"]
Apr 16 19:55:03.249873 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:55:03.249843 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode33890cf_1250_4378_b5a9_3ef264f23dad.slice/crio-68ec5c0926ce85ffe2f4dc57ce0c3edead8e0da9063d0926161a7be280c1517e WatchSource:0}: Error finding container 68ec5c0926ce85ffe2f4dc57ce0c3edead8e0da9063d0926161a7be280c1517e: Status 404 returned error can't find the container with id 68ec5c0926ce85ffe2f4dc57ce0c3edead8e0da9063d0926161a7be280c1517e
Apr 16 19:55:03.520482 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.520437 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"dd98646a914569cb5ae641e679d1c2b19d1e86a74d52a065fd5a9ed5088d419c"}
Apr 16 19:55:03.520641 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.520489 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"960eb065f9261061323a30aa2aba23372231587542c7e2f728136df6fd124239"}
Apr 16 19:55:03.520641 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.520506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" event={"ID":"fa467cab-b8b0-4afb-9a81-a5ed7c53cd97","Type":"ContainerStarted","Data":"465ac8405ad07c68ea5bdf680ab78642ad94aca1a5019994344a085a8c16d5bb"}
Apr 16 19:55:03.520641 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.520594 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj"
Apr 16 19:55:03.523445 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.523413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerStarted","Data":"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"}
Apr 16 19:55:03.524859 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.524832 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw" event={"ID":"8995f8ed-4eab-4bb2-a5a7-2df79fd493f4","Type":"ContainerStarted","Data":"1fdaaab6643e16581f616729b0e95983978f386341b25f8814dae2a2c1af53f1"}
Apr 16 19:55:03.525106 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.525076 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw"
Apr 16 19:55:03.525959 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.525938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rt6tz" event={"ID":"e33890cf-1250-4378-b5a9-3ef264f23dad","Type":"ContainerStarted","Data":"68ec5c0926ce85ffe2f4dc57ce0c3edead8e0da9063d0926161a7be280c1517e"}
Apr 16 19:55:03.527028 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.527006 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8x8wb" event={"ID":"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1","Type":"ContainerStarted","Data":"10764d9159a5e236b9d76229d5f445acc3d81108e7291ca9d494280061d23d6f"}
Apr 16 19:55:03.528532 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.528510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" event={"ID":"4099f14d-0af4-4fc9-ad94-aa5abbfc67ce","Type":"ContainerStarted","Data":"48ddc284990ca01ded2013101250448b31311edaf7f037e0feae312d89c79499"}
Apr 16 19:55:03.530584 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.530567 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw" Apr 16 19:55:03.550844 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.550715 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" podStartSLOduration=1.6219199770000001 podStartE2EDuration="6.550696529s" podCreationTimestamp="2026-04-16 19:54:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:58.12837957 +0000 UTC m=+61.663245584" lastFinishedPulling="2026-04-16 19:55:03.057156119 +0000 UTC m=+66.592022136" observedRunningTime="2026-04-16 19:55:03.549092876 +0000 UTC m=+67.083958912" watchObservedRunningTime="2026-04-16 19:55:03.550696529 +0000 UTC m=+67.085562566" Apr 16 19:55:03.586726 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.586668 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.356385807 podStartE2EDuration="8.586653183s" podCreationTimestamp="2026-04-16 19:54:55 +0000 UTC" firstStartedPulling="2026-04-16 19:54:56.828720091 +0000 UTC m=+60.363586106" lastFinishedPulling="2026-04-16 19:55:03.058987467 +0000 UTC m=+66.593853482" observedRunningTime="2026-04-16 19:55:03.585469123 +0000 UTC m=+67.120335160" watchObservedRunningTime="2026-04-16 19:55:03.586653183 +0000 UTC m=+67.121519218" Apr 16 19:55:03.618368 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.618321 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" podStartSLOduration=2.549259339 podStartE2EDuration="4.618305292s" podCreationTimestamp="2026-04-16 19:54:59 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.987998889 +0000 UTC m=+64.522864903" lastFinishedPulling="2026-04-16 19:55:03.057044841 +0000 UTC m=+66.591910856" observedRunningTime="2026-04-16 19:55:03.618123781 +0000 UTC m=+67.152989818" watchObservedRunningTime="2026-04-16 19:55:03.618305292 +0000 UTC 
m=+67.153171328" Apr 16 19:55:03.655333 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:03.654976 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65ndw" podStartSLOduration=2.5854015219999997 podStartE2EDuration="4.654958084s" podCreationTimestamp="2026-04-16 19:54:59 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.987997784 +0000 UTC m=+64.522863797" lastFinishedPulling="2026-04-16 19:55:03.05755434 +0000 UTC m=+66.592420359" observedRunningTime="2026-04-16 19:55:03.653755556 +0000 UTC m=+67.188621594" watchObservedRunningTime="2026-04-16 19:55:03.654958084 +0000 UTC m=+67.189824123" Apr 16 19:55:04.066203 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.066170 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77ddbf6cbb-9s7fq"] Apr 16 19:55:04.101338 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.101303 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8595f7f78f-8hnwp"] Apr 16 19:55:04.105895 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.105866 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.114611 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.114162 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8595f7f78f-8hnwp"] Apr 16 19:55:04.174757 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.174698 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-oauth-serving-cert\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.174936 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.174822 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-oauth-config\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.174936 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.174872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-serving-cert\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.174936 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.174918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnz6f\" (UniqueName: \"kubernetes.io/projected/2b015b01-9339-426c-9bfa-195d80ec1a92-kube-api-access-qnz6f\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " 
pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.175135 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.175014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-service-ca\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.175135 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.175105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-trusted-ca-bundle\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.175215 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.175164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-console-config\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276085 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-oauth-config\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-serving-cert\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276155 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnz6f\" (UniqueName: \"kubernetes.io/projected/2b015b01-9339-426c-9bfa-195d80ec1a92-kube-api-access-qnz6f\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-service-ca\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-trusted-ca-bundle\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-console-config\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.276543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.276302 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-oauth-serving-cert\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.277085 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.277008 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-oauth-serving-cert\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.277400 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.277312 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-console-config\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.277521 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.277422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-service-ca\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.277625 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.277597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-trusted-ca-bundle\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.278972 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.278948 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-oauth-config\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.279295 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.279275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-serving-cert\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.285784 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.285728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnz6f\" (UniqueName: \"kubernetes.io/projected/2b015b01-9339-426c-9bfa-195d80ec1a92-kube-api-access-qnz6f\") pod \"console-8595f7f78f-8hnwp\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") " pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.419141 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.419111 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:04.539169 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.539109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8x8wb" event={"ID":"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1","Type":"ContainerStarted","Data":"85ceb36668684a4b0c268f95ff26e0f3162866318501ce1f686964c3e91040a8"} Apr 16 19:55:04.590494 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:04.590467 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8595f7f78f-8hnwp"] Apr 16 19:55:04.592927 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:55:04.592894 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b015b01_9339_426c_9bfa_195d80ec1a92.slice/crio-63693ca9f7bea7f90c4a346bd9ff8b685902fdec2bc39f7d882556459d298af1 WatchSource:0}: Error finding container 63693ca9f7bea7f90c4a346bd9ff8b685902fdec2bc39f7d882556459d298af1: Status 404 returned error can't find the container with id 63693ca9f7bea7f90c4a346bd9ff8b685902fdec2bc39f7d882556459d298af1 Apr 16 19:55:05.545427 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:05.545387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8x8wb" event={"ID":"4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1","Type":"ContainerStarted","Data":"ec60e14280fd14c7b4f608562080ac7b274bb3553e4329418dd8e60d052210cc"} Apr 16 19:55:05.547160 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:05.547096 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8595f7f78f-8hnwp" event={"ID":"2b015b01-9339-426c-9bfa-195d80ec1a92","Type":"ContainerStarted","Data":"c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f"} Apr 16 19:55:05.547160 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:05.547134 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-8595f7f78f-8hnwp" event={"ID":"2b015b01-9339-426c-9bfa-195d80ec1a92","Type":"ContainerStarted","Data":"63693ca9f7bea7f90c4a346bd9ff8b685902fdec2bc39f7d882556459d298af1"} Apr 16 19:55:05.583351 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:05.583289 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8595f7f78f-8hnwp" podStartSLOduration=1.58326838 podStartE2EDuration="1.58326838s" podCreationTimestamp="2026-04-16 19:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:05.58239102 +0000 UTC m=+69.117257057" watchObservedRunningTime="2026-04-16 19:55:05.58326838 +0000 UTC m=+69.118134413" Apr 16 19:55:05.584219 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:05.584181 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8x8wb" podStartSLOduration=67.527405322 podStartE2EDuration="1m8.584169879s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.237936471 +0000 UTC m=+66.772802485" lastFinishedPulling="2026-04-16 19:55:04.294701027 +0000 UTC m=+67.829567042" observedRunningTime="2026-04-16 19:55:05.56170194 +0000 UTC m=+69.096567979" watchObservedRunningTime="2026-04-16 19:55:05.584169879 +0000 UTC m=+69.119035914" Apr 16 19:55:06.551458 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:06.551419 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rt6tz" event={"ID":"e33890cf-1250-4378-b5a9-3ef264f23dad","Type":"ContainerStarted","Data":"71b3e31ad37734ab0fbc54af8d535e192e0ee3c3f2b4081a087daaebd3401f30"} Apr 16 19:55:06.551845 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:06.551563 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:55:06.569487 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:06.569441 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rt6tz" podStartSLOduration=66.703927487 podStartE2EDuration="1m9.569426189s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.252075822 +0000 UTC m=+66.786941841" lastFinishedPulling="2026-04-16 19:55:06.117574515 +0000 UTC m=+69.652440543" observedRunningTime="2026-04-16 19:55:06.568349197 +0000 UTC m=+70.103215232" watchObservedRunningTime="2026-04-16 19:55:06.569426189 +0000 UTC m=+70.104292226" Apr 16 19:55:08.563728 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:08.563685 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77ddbf6cbb-9s7fq" Apr 16 19:55:09.544957 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:09.544930 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6896fd559d-s8tfj" Apr 16 19:55:14.419227 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:14.419183 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:14.419227 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:14.419239 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:14.423801 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:14.423778 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:14.580835 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:14.580808 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8595f7f78f-8hnwp" Apr 16 19:55:14.647964 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:14.647934 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6984c5ccbb-99m8p"] Apr 16 19:55:19.557201 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:19.557073 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" Apr 16 19:55:19.557201 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:19.557115 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" Apr 16 19:55:29.090572 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.090509 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77ddbf6cbb-9s7fq" podUID="3362b23b-452e-46f5-9fd6-86ecb0a197bf" containerName="console" containerID="cri-o://3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae" gracePeriod=15 Apr 16 19:55:29.331542 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.331518 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77ddbf6cbb-9s7fq_3362b23b-452e-46f5-9fd6-86ecb0a197bf/console/0.log" Apr 16 19:55:29.331690 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.331608 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77ddbf6cbb-9s7fq" Apr 16 19:55:29.501047 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501014 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-config\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501222 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501115 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-oauth-serving-cert\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501222 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501135 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-service-ca\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501222 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501153 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n76zz\" (UniqueName: \"kubernetes.io/projected/3362b23b-452e-46f5-9fd6-86ecb0a197bf-kube-api-access-n76zz\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501222 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501177 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-trusted-ca-bundle\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501222 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:55:29.501203 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-oauth-config\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501446 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501232 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-serving-cert\") pod \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\" (UID: \"3362b23b-452e-46f5-9fd6-86ecb0a197bf\") " Apr 16 19:55:29.501606 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501572 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-config" (OuterVolumeSpecName: "console-config") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:29.501730 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501572 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-service-ca" (OuterVolumeSpecName: "service-ca") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:29.501791 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501716 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:29.501791 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.501726 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:29.503499 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.503471 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:29.503602 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.503504 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:29.503602 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.503548 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3362b23b-452e-46f5-9fd6-86ecb0a197bf-kube-api-access-n76zz" (OuterVolumeSpecName: "kube-api-access-n76zz") pod "3362b23b-452e-46f5-9fd6-86ecb0a197bf" (UID: "3362b23b-452e-46f5-9fd6-86ecb0a197bf"). InnerVolumeSpecName "kube-api-access-n76zz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:29.602752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602718 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-oauth-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.602752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602746 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.602752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602756 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-console-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.602973 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602766 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-oauth-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.602973 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602776 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-service-ca\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.602973 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602785 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n76zz\" (UniqueName: \"kubernetes.io/projected/3362b23b-452e-46f5-9fd6-86ecb0a197bf-kube-api-access-n76zz\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.602973 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.602793 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3362b23b-452e-46f5-9fd6-86ecb0a197bf-trusted-ca-bundle\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.619638 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.619612 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77ddbf6cbb-9s7fq_3362b23b-452e-46f5-9fd6-86ecb0a197bf/console/0.log" Apr 16 19:55:29.619784 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.619654 2570 generic.go:358] "Generic (PLEG): container finished" podID="3362b23b-452e-46f5-9fd6-86ecb0a197bf" containerID="3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae" exitCode=2 Apr 16 19:55:29.619784 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.619718 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77ddbf6cbb-9s7fq" Apr 16 19:55:29.619784 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.619741 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77ddbf6cbb-9s7fq" event={"ID":"3362b23b-452e-46f5-9fd6-86ecb0a197bf","Type":"ContainerDied","Data":"3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae"} Apr 16 19:55:29.619784 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.619779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77ddbf6cbb-9s7fq" event={"ID":"3362b23b-452e-46f5-9fd6-86ecb0a197bf","Type":"ContainerDied","Data":"8620abb2899af328170c23f3998075b4cccead5a35037929663831e62ba68154"} Apr 16 19:55:29.619949 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.619794 2570 scope.go:117] "RemoveContainer" containerID="3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae" Apr 16 19:55:29.628752 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.628732 2570 scope.go:117] "RemoveContainer" containerID="3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae" Apr 16 19:55:29.629038 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:55:29.629020 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae\": container with ID starting with 3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae not found: ID does not exist" containerID="3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae" Apr 16 19:55:29.629105 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.629047 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae"} err="failed to get container status \"3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae\": rpc error: code = 
NotFound desc = could not find container \"3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae\": container with ID starting with 3ee02ccd8168476dfc7c8fa2566da76270bef728ebfc93dda08ef889408d74ae not found: ID does not exist" Apr 16 19:55:29.641401 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.641377 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77ddbf6cbb-9s7fq"] Apr 16 19:55:29.646113 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:29.646092 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77ddbf6cbb-9s7fq"] Apr 16 19:55:31.111991 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:31.111956 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3362b23b-452e-46f5-9fd6-86ecb0a197bf" path="/var/lib/kubelet/pods/3362b23b-452e-46f5-9fd6-86ecb0a197bf/volumes" Apr 16 19:55:37.556888 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:37.556852 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rt6tz" Apr 16 19:55:39.562640 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:39.562609 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" Apr 16 19:55:39.566676 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:39.566650 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57c6995b97-7nqjd" Apr 16 19:55:39.670334 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:39.670274 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6984c5ccbb-99m8p" podUID="18958238-c99b-4a8f-94f1-659781539d41" containerName="console" containerID="cri-o://6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89" gracePeriod=15 Apr 16 19:55:39.913291 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:39.913268 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-6984c5ccbb-99m8p_18958238-c99b-4a8f-94f1-659781539d41/console/0.log" Apr 16 19:55:39.913426 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:39.913328 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:55:40.089149 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089008 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-oauth-serving-cert\") pod \"18958238-c99b-4a8f-94f1-659781539d41\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " Apr 16 19:55:40.089149 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089152 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-service-ca\") pod \"18958238-c99b-4a8f-94f1-659781539d41\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " Apr 16 19:55:40.089393 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089183 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-console-config\") pod \"18958238-c99b-4a8f-94f1-659781539d41\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " Apr 16 19:55:40.089393 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089208 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-oauth-config\") pod \"18958238-c99b-4a8f-94f1-659781539d41\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " Apr 16 19:55:40.089393 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089232 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-serving-cert\") pod \"18958238-c99b-4a8f-94f1-659781539d41\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " Apr 16 19:55:40.089393 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089354 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7m4c\" (UniqueName: \"kubernetes.io/projected/18958238-c99b-4a8f-94f1-659781539d41-kube-api-access-j7m4c\") pod \"18958238-c99b-4a8f-94f1-659781539d41\" (UID: \"18958238-c99b-4a8f-94f1-659781539d41\") " Apr 16 19:55:40.089594 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089503 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "18958238-c99b-4a8f-94f1-659781539d41" (UID: "18958238-c99b-4a8f-94f1-659781539d41"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:40.089660 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089596 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-service-ca" (OuterVolumeSpecName: "service-ca") pod "18958238-c99b-4a8f-94f1-659781539d41" (UID: "18958238-c99b-4a8f-94f1-659781539d41"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:40.089660 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089603 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-console-config" (OuterVolumeSpecName: "console-config") pod "18958238-c99b-4a8f-94f1-659781539d41" (UID: "18958238-c99b-4a8f-94f1-659781539d41"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:40.089750 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.089667 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-oauth-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.091467 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.091433 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18958238-c99b-4a8f-94f1-659781539d41-kube-api-access-j7m4c" (OuterVolumeSpecName: "kube-api-access-j7m4c") pod "18958238-c99b-4a8f-94f1-659781539d41" (UID: "18958238-c99b-4a8f-94f1-659781539d41"). InnerVolumeSpecName "kube-api-access-j7m4c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:40.091582 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.091488 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "18958238-c99b-4a8f-94f1-659781539d41" (UID: "18958238-c99b-4a8f-94f1-659781539d41"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:40.091582 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.091519 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "18958238-c99b-4a8f-94f1-659781539d41" (UID: "18958238-c99b-4a8f-94f1-659781539d41"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:40.190314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.190256 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-service-ca\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.190314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.190306 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18958238-c99b-4a8f-94f1-659781539d41-console-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.190314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.190316 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-oauth-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.190314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.190325 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18958238-c99b-4a8f-94f1-659781539d41-console-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.190314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.190335 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j7m4c\" (UniqueName: \"kubernetes.io/projected/18958238-c99b-4a8f-94f1-659781539d41-kube-api-access-j7m4c\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.657301 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.657274 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6984c5ccbb-99m8p_18958238-c99b-4a8f-94f1-659781539d41/console/0.log" Apr 16 19:55:40.657713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.657317 2570 generic.go:358] "Generic (PLEG): container 
finished" podID="18958238-c99b-4a8f-94f1-659781539d41" containerID="6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89" exitCode=2 Apr 16 19:55:40.657713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.657350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6984c5ccbb-99m8p" event={"ID":"18958238-c99b-4a8f-94f1-659781539d41","Type":"ContainerDied","Data":"6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89"} Apr 16 19:55:40.657713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.657394 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6984c5ccbb-99m8p" Apr 16 19:55:40.657713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.657399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6984c5ccbb-99m8p" event={"ID":"18958238-c99b-4a8f-94f1-659781539d41","Type":"ContainerDied","Data":"b30fe855afbf0de108bac8794508bbdda134d0f59fa3f3bfa11249045caaa398"} Apr 16 19:55:40.657713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.657419 2570 scope.go:117] "RemoveContainer" containerID="6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89" Apr 16 19:55:40.667181 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.667022 2570 scope.go:117] "RemoveContainer" containerID="6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89" Apr 16 19:55:40.667366 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:55:40.667337 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89\": container with ID starting with 6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89 not found: ID does not exist" containerID="6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89" Apr 16 19:55:40.667476 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.667377 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89"} err="failed to get container status \"6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89\": rpc error: code = NotFound desc = could not find container \"6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89\": container with ID starting with 6980a5128df4ae503a1fcd881194b29f5cc639c0ed7eb2d8c56207af78f2cc89 not found: ID does not exist" Apr 16 19:55:40.680603 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.680577 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6984c5ccbb-99m8p"] Apr 16 19:55:40.683505 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:40.683477 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6984c5ccbb-99m8p"] Apr 16 19:55:41.110790 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:55:41.110759 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18958238-c99b-4a8f-94f1-659781539d41" path="/var/lib/kubelet/pods/18958238-c99b-4a8f-94f1-659781539d41/volumes" Apr 16 19:56:14.993393 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993356 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:14.993887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993811 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-web" containerID="cri-o://7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e" gracePeriod=120 Apr 16 19:56:14.993887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993808 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" 
containerName="kube-rbac-proxy-metric" containerID="cri-o://4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54" gracePeriod=120 Apr 16 19:56:14.993887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993836 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="config-reloader" containerID="cri-o://236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89" gracePeriod=120 Apr 16 19:56:14.993887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993867 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy" containerID="cri-o://af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453" gracePeriod=120 Apr 16 19:56:14.994116 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993880 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="prom-label-proxy" containerID="cri-o://6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a" gracePeriod=120 Apr 16 19:56:14.994116 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:14.993789 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="alertmanager" containerID="cri-o://a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340" gracePeriod=120 Apr 16 19:56:15.755966 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.755925 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a" exitCode=0 Apr 16 19:56:15.755966 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:56:15.755962 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453" exitCode=0 Apr 16 19:56:15.755966 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.755969 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89" exitCode=0 Apr 16 19:56:15.755966 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.755975 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340" exitCode=0 Apr 16 19:56:15.756243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.755991 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"} Apr 16 19:56:15.756243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.756023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"} Apr 16 19:56:15.756243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.756033 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"} Apr 16 19:56:15.756243 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:15.756043 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"} Apr 16 19:56:16.237508 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.237485 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.374977 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.374889 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-web-config\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.374977 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.374934 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.374977 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.374955 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-tls-assets\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.374977 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.374978 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-cluster-tls-config\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375347 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.374996 2570 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-out\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375347 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375017 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-volume\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375347 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375091 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375347 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375125 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-main-db\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375551 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375383 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:16.375551 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375523 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-web\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375555 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-metrics-client-ca\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375595 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22ccx\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-kube-api-access-22ccx\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375638 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.375655 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375636 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod 
"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:16.375841 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375699 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\" (UID: \"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6\") " Apr 16 19:56:16.376281 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375946 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:56:16.376281 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.375980 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-alertmanager-main-db\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:56:16.376281 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.376092 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:16.377794 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.377754 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-out" (OuterVolumeSpecName: "config-out") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:16.378032 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.377999 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-volume" (OuterVolumeSpecName: "config-volume") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:16.378356 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.378330 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:16.378548 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.378517 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:16.378640 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.378526 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:16.378928 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.378907 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:16.380008 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.379987 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-kube-api-access-22ccx" (OuterVolumeSpecName: "kube-api-access-22ccx") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "kube-api-access-22ccx". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:56:16.380328 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.380307 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:16.382438 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.382413 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:16.387824 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.387803 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-web-config" (OuterVolumeSpecName: "web-config") pod "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" (UID: "bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:16.476684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476635 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476682 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-metrics-client-ca\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476695 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22ccx\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-kube-api-access-22ccx\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476706 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476715 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476727 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-web-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476737 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-tls-assets\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476744 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-cluster-tls-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476753 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-out\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476760 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-config-volume\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.476916 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.476768 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6-secret-alertmanager-main-tls\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:16.761102 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761048 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54" exitCode=0
Apr 16 19:56:16.761102 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761096 2570 generic.go:358] "Generic (PLEG): container finished" podID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerID="7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e" exitCode=0
Apr 16 19:56:16.761102 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761091 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"}
Apr 16 19:56:16.761376 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761136 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"}
Apr 16 19:56:16.761376 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6","Type":"ContainerDied","Data":"f23350fd934034fe054e43694b893dd641bc60aadf38783626814e1b3e099406"}
Apr 16 19:56:16.761376 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761169 2570 scope.go:117] "RemoveContainer" containerID="6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"
Apr 16 19:56:16.761376 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.761198 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.768197 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.768180 2570 scope.go:117] "RemoveContainer" containerID="4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"
Apr 16 19:56:16.774831 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.774810 2570 scope.go:117] "RemoveContainer" containerID="af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"
Apr 16 19:56:16.781563 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.781545 2570 scope.go:117] "RemoveContainer" containerID="7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"
Apr 16 19:56:16.786719 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.786698 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:56:16.788538 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.788517 2570 scope.go:117] "RemoveContainer" containerID="236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"
Apr 16 19:56:16.790860 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.790840 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:56:16.795684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.795660 2570 scope.go:117] "RemoveContainer" containerID="a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"
Apr 16 19:56:16.801845 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.801823 2570 scope.go:117] "RemoveContainer" containerID="b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"
Apr 16 19:56:16.808159 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808130 2570 scope.go:117] "RemoveContainer" containerID="6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"
Apr 16 19:56:16.808389 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.808368 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a\": container with ID starting with 6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a not found: ID does not exist" containerID="6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"
Apr 16 19:56:16.808448 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808398 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"} err="failed to get container status \"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a\": rpc error: code = NotFound desc = could not find container \"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a\": container with ID starting with 6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a not found: ID does not exist"
Apr 16 19:56:16.808448 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808415 2570 scope.go:117] "RemoveContainer" containerID="4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"
Apr 16 19:56:16.808636 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.808618 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54\": container with ID starting with 4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54 not found: ID does not exist" containerID="4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"
Apr 16 19:56:16.808672 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808641 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"} err="failed to get container status \"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54\": rpc error: code = NotFound desc = could not find container \"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54\": container with ID starting with 4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54 not found: ID does not exist"
Apr 16 19:56:16.808672 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808654 2570 scope.go:117] "RemoveContainer" containerID="af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"
Apr 16 19:56:16.808887 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.808870 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453\": container with ID starting with af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453 not found: ID does not exist" containerID="af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"
Apr 16 19:56:16.808920 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808895 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"} err="failed to get container status \"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453\": rpc error: code = NotFound desc = could not find container \"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453\": container with ID starting with af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453 not found: ID does not exist"
Apr 16 19:56:16.808920 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.808915 2570 scope.go:117] "RemoveContainer" containerID="7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"
Apr 16 19:56:16.809164 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.809142 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e\": container with ID starting with 7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e not found: ID does not exist" containerID="7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"
Apr 16 19:56:16.809270 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809166 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"} err="failed to get container status \"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e\": rpc error: code = NotFound desc = could not find container \"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e\": container with ID starting with 7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e not found: ID does not exist"
Apr 16 19:56:16.809270 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809181 2570 scope.go:117] "RemoveContainer" containerID="236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"
Apr 16 19:56:16.809397 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.809382 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89\": container with ID starting with 236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89 not found: ID does not exist" containerID="236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"
Apr 16 19:56:16.809435 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809400 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"} err="failed to get container status \"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89\": rpc error: code = NotFound desc = could not find container \"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89\": container with ID starting with 236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89 not found: ID does not exist"
Apr 16 19:56:16.809435 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809412 2570 scope.go:117] "RemoveContainer" containerID="a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"
Apr 16 19:56:16.809629 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.809613 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340\": container with ID starting with a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340 not found: ID does not exist" containerID="a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"
Apr 16 19:56:16.809674 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809633 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"} err="failed to get container status \"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340\": rpc error: code = NotFound desc = could not find container \"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340\": container with ID starting with a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340 not found: ID does not exist"
Apr 16 19:56:16.809674 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809645 2570 scope.go:117] "RemoveContainer" containerID="b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"
Apr 16 19:56:16.809841 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:16.809827 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99\": container with ID starting with b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99 not found: ID does not exist" containerID="b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"
Apr 16 19:56:16.809880 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809846 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"} err="failed to get container status \"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99\": rpc error: code = NotFound desc = could not find container \"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99\": container with ID starting with b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99 not found: ID does not exist"
Apr 16 19:56:16.809880 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.809859 2570 scope.go:117] "RemoveContainer" containerID="6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"
Apr 16 19:56:16.810040 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810025 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a"} err="failed to get container status \"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a\": rpc error: code = NotFound desc = could not find container \"6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a\": container with ID starting with 6b5f09cde6f41c477ded3fa13bc6946a3df228322cde7526aefb0d7e1533cd7a not found: ID does not exist"
Apr 16 19:56:16.810115 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810040 2570 scope.go:117] "RemoveContainer" containerID="4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"
Apr 16 19:56:16.810284 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810262 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54"} err="failed to get container status \"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54\": rpc error: code = NotFound desc = could not find container \"4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54\": container with ID starting with 4bc5571fa7a21cee81f5eb0e732050d572835d6ae6064929e0e6bec6b8fb1d54 not found: ID does not exist"
Apr 16 19:56:16.810348 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810285 2570 scope.go:117] "RemoveContainer" containerID="af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"
Apr 16 19:56:16.810519 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810487 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453"} err="failed to get container status \"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453\": rpc error: code = NotFound desc = could not find container \"af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453\": container with ID starting with af3a5bccb517b1f1e701e36a490e0f5a3d497afec9a4ded8688101030d4a5453 not found: ID does not exist"
Apr 16 19:56:16.810561 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810521 2570 scope.go:117] "RemoveContainer" containerID="7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"
Apr 16 19:56:16.810713 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810698 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e"} err="failed to get container status \"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e\": rpc error: code = NotFound desc = could not find container \"7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e\": container with ID starting with 7687db0303a3beb2225fec9af69dabe03f7b9dd89273fd3a7b5821f47a0cdb8e not found: ID does not exist"
Apr 16 19:56:16.810762 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810714 2570 scope.go:117] "RemoveContainer" containerID="236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"
Apr 16 19:56:16.810928 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810907 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89"} err="failed to get container status \"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89\": rpc error: code = NotFound desc = could not find container \"236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89\": container with ID starting with 236b0cfeb8267a17f74c4adb96db2e8f6c0fa5e444024e0b0a45c5df5e5e8b89 not found: ID does not exist"
Apr 16 19:56:16.810968 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.810929 2570 scope.go:117] "RemoveContainer" containerID="a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"
Apr 16 19:56:16.811180 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.811163 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340"} err="failed to get container status \"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340\": rpc error: code = NotFound desc = could not find container \"a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340\": container with ID starting with a422f6cf61e892798ff6e7678ddf3ff67aeaa992368163204a64d0be5d3a9340 not found: ID does not exist"
Apr 16 19:56:16.811230 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.811181 2570 scope.go:117] "RemoveContainer" containerID="b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"
Apr 16 19:56:16.811409 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.811392 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99"} err="failed to get container status \"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99\": rpc error: code = NotFound desc = could not find container \"b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99\": container with ID starting with b8cc387f31f5b976001f1b54990c26abdb2b026cfd0a6fd5b761229380128b99 not found: ID does not exist"
Apr 16 19:56:16.818628 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.818604 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:56:16.819071 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819040 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="config-reloader"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819073 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="config-reloader"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819086 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="alertmanager"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819092 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="alertmanager"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819100 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="init-config-reloader"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819105 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="init-config-reloader"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819115 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-web"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819120 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-web"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819126 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819131 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819139 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-metric"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819144 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-metric"
Apr 16 19:56:16.819150 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819153 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="prom-label-proxy"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819158 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="prom-label-proxy"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819166 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3362b23b-452e-46f5-9fd6-86ecb0a197bf" containerName="console"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819171 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3362b23b-452e-46f5-9fd6-86ecb0a197bf" containerName="console"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819179 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18958238-c99b-4a8f-94f1-659781539d41" containerName="console"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819184 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="18958238-c99b-4a8f-94f1-659781539d41" containerName="console"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819230 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-web"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819241 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819248 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3362b23b-452e-46f5-9fd6-86ecb0a197bf" containerName="console"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819253 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="kube-rbac-proxy-metric"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819260 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="config-reloader"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819266 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="alertmanager"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819271 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" containerName="prom-label-proxy"
Apr 16 19:56:16.819503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.819277 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="18958238-c99b-4a8f-94f1-659781539d41" containerName="console"
Apr 16 19:56:16.825889 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.824137 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.827176 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827148 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 19:56:16.827309 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827175 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 19:56:16.827469 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827452 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 19:56:16.827543 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827488 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8qhhm\""
Apr 16 19:56:16.827603 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827555 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 19:56:16.827603 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 19:56:16.827705 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 19:56:16.827925 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.827910 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 19:56:16.828141 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.828125 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 19:56:16.833389 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.833368 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 19:56:16.841332 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.841305 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:56:16.879299 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879264 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmwp\" (UniqueName: \"kubernetes.io/projected/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-kube-api-access-4hmwp\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879457 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879325 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879457 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879457 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879457 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-config-out\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879608 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879608 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879507 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-web-config\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879608 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879608 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879736 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879736 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879646 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879736 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.879736 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.879694 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.980651 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.980651 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-config-out\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-web-config\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.980887 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.981323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.981323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980939 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.981323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.980986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmwp\" (UniqueName: 
\"kubernetes.io/projected/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-kube-api-access-4hmwp\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.981323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.981096 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.983600 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.983563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-config-out\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.983748 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.983826 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.983825 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.983845 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.983943 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-web-config\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.984094 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984474 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.984401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984474 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.984420 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.984548 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.984504 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.985685 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.985667 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:16.989840 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:16.989821 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmwp\" (UniqueName: \"kubernetes.io/projected/58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0-kube-api-access-4hmwp\") pod \"alertmanager-main-0\" (UID: \"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:17.110666 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:17.110631 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6" path="/var/lib/kubelet/pods/bbf9b856-3ec2-4edd-beaa-0aebf8bb41f6/volumes" Apr 16 19:56:17.135363 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:17.135331 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:17.269873 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:56:17.269835 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ad724e_d6fb_4e83_96a1_0cdd2d3a50b0.slice/crio-7a21e65da229fa7e4e982c68c08555a04239852bb8b092ffcaf5f0d2c1c34969 WatchSource:0}: Error finding container 7a21e65da229fa7e4e982c68c08555a04239852bb8b092ffcaf5f0d2c1c34969: Status 404 returned error can't find the container with id 7a21e65da229fa7e4e982c68c08555a04239852bb8b092ffcaf5f0d2c1c34969 Apr 16 19:56:17.272120 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:17.272096 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:17.766087 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:17.766039 2570 generic.go:358] "Generic (PLEG): container finished" podID="58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0" containerID="52fcce137df74c790488d204852d04a3b4ce876f14e87a96020083b9e3a9b82b" exitCode=0 Apr 16 19:56:17.766250 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:17.766109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerDied","Data":"52fcce137df74c790488d204852d04a3b4ce876f14e87a96020083b9e3a9b82b"} Apr 16 19:56:17.766250 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:17.766128 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"7a21e65da229fa7e4e982c68c08555a04239852bb8b092ffcaf5f0d2c1c34969"} Apr 16 19:56:18.772168 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.772131 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"cb25a99b9f05c1980b3760b1a4ec7c97887152cdfcae74734b2a14013a5e1b9c"} Apr 16 19:56:18.772168 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.772168 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"0cdbcb6744ceaf567c7031687fa3a18699991e8de7c526786db26133153c518f"} Apr 16 19:56:18.772683 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.772179 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"90f2e0e9e6c132199dc2ed58d0f9726fb6d2251d3044aee70a7669ac78216dec"} Apr 16 19:56:18.772683 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.772190 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"4f6a0698e3c69efadc8241a32ac75399babe6b76082e6082f52e4de06eb4dd36"} Apr 16 19:56:18.772683 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.772198 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"2cebb3882575220bba7026848f9240e704211053fcc2568297b29717208b3bac"} Apr 16 19:56:18.772683 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.772206 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0","Type":"ContainerStarted","Data":"02bc3025ed195bc401d6b22ee9f394aad80b2f36d4ccdd81750869cae1d2a9a3"} Apr 16 19:56:18.800545 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:18.800491 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.800476036 podStartE2EDuration="2.800476036s" podCreationTimestamp="2026-04-16 19:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:18.798014296 +0000 UTC m=+142.332880331" watchObservedRunningTime="2026-04-16 19:56:18.800476036 +0000 UTC m=+142.335342071" Apr 16 19:56:19.012984 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.012935 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww"] Apr 16 19:56:19.017259 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.017231 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.021497 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.021012 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 19:56:19.021497 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.021273 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 19:56:19.021711 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.021509 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 19:56:19.021711 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.021650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-bztx5\"" Apr 16 19:56:19.021846 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.021822 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 19:56:19.021908 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:56:19.021830 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 19:56:19.029867 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.029838 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww"] Apr 16 19:56:19.034040 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.034015 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 19:56:19.098579 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098541 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-federate-client-tls\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.098749 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-serving-certs-ca-bundle\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.098749 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098660 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-metrics-client-ca\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.098749 
ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.098882 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098762 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-secret-telemeter-client\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.098882 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j82f\" (UniqueName: \"kubernetes.io/projected/fc862099-3346-4ac3-b874-451d17baebf9-kube-api-access-7j82f\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.098882 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-telemeter-client-tls\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.099020 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.098917 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200098 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200041 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200365 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-federate-client-tls\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200514 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-serving-certs-ca-bundle\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200607 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-metrics-client-ca\") pod 
\"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200705 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200693 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200811 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-secret-telemeter-client\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.200895 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j82f\" (UniqueName: \"kubernetes.io/projected/fc862099-3346-4ac3-b874-451d17baebf9-kube-api-access-7j82f\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.201002 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.200990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-telemeter-client-tls\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.202336 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:56:19.202310 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-metrics-client-ca\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.202696 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.202665 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-serving-certs-ca-bundle\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.202801 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.202697 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc862099-3346-4ac3-b874-451d17baebf9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.204563 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.204535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-federate-client-tls\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.204665 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.204587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-telemeter-client-tls\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: 
\"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.204778 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.204752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.204934 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.204911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fc862099-3346-4ac3-b874-451d17baebf9-secret-telemeter-client\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.213145 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.213118 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j82f\" (UniqueName: \"kubernetes.io/projected/fc862099-3346-4ac3-b874-451d17baebf9-kube-api-access-7j82f\") pod \"telemeter-client-57b9c5ddb9-tsmww\" (UID: \"fc862099-3346-4ac3-b874-451d17baebf9\") " pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" Apr 16 19:56:19.333246 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.333144 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww"
Apr 16 19:56:19.475680 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.475645 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww"]
Apr 16 19:56:19.478467 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:56:19.478439 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc862099_3346_4ac3_b874_451d17baebf9.slice/crio-5386e648dca520db6dee426bda99642aab9cd8769651201e1e5b80c17713b930 WatchSource:0}: Error finding container 5386e648dca520db6dee426bda99642aab9cd8769651201e1e5b80c17713b930: Status 404 returned error can't find the container with id 5386e648dca520db6dee426bda99642aab9cd8769651201e1e5b80c17713b930
Apr 16 19:56:19.777084 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:19.777024 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" event={"ID":"fc862099-3346-4ac3-b874-451d17baebf9","Type":"ContainerStarted","Data":"5386e648dca520db6dee426bda99642aab9cd8769651201e1e5b80c17713b930"}
Apr 16 19:56:21.784448 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:21.784409 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" event={"ID":"fc862099-3346-4ac3-b874-451d17baebf9","Type":"ContainerStarted","Data":"e75318319efdab4c824deffa6305058982ec22f04558fd68d04acca27849f873"}
Apr 16 19:56:21.784843 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:21.784453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" event={"ID":"fc862099-3346-4ac3-b874-451d17baebf9","Type":"ContainerStarted","Data":"78685ba228fb937402ecd678b8bc5149772def97f21c3c9a31014ffed974311d"}
Apr 16 19:56:21.784843 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:21.784469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" event={"ID":"fc862099-3346-4ac3-b874-451d17baebf9","Type":"ContainerStarted","Data":"b013132775a0bc7c539dde20b97f214704d3a1f1dfde18b5335c09d3659c55ff"}
Apr 16 19:56:21.806213 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:21.806165 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-57b9c5ddb9-tsmww" podStartSLOduration=1.725973204 podStartE2EDuration="3.806148997s" podCreationTimestamp="2026-04-16 19:56:18 +0000 UTC" firstStartedPulling="2026-04-16 19:56:19.480404347 +0000 UTC m=+143.015270360" lastFinishedPulling="2026-04-16 19:56:21.560580137 +0000 UTC m=+145.095446153" observedRunningTime="2026-04-16 19:56:21.803541231 +0000 UTC m=+145.338407267" watchObservedRunningTime="2026-04-16 19:56:21.806148997 +0000 UTC m=+145.341015191"
Apr 16 19:56:23.031372 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.031329 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9dc7d57f7-t2m9f"]
Apr 16 19:56:23.034730 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.034711 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.046684 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.046657 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9dc7d57f7-t2m9f"]
Apr 16 19:56:23.136615 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzd6x\" (UniqueName: \"kubernetes.io/projected/90c54830-b3ba-4066-9794-55e29cf3e406-kube-api-access-dzd6x\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.136779 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-service-ca\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.136779 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136646 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-oauth-serving-cert\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.136779 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-console-config\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.136779 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-trusted-ca-bundle\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.136779 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136735 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-oauth-config\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.136779 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.136759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-serving-cert\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.237959 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.237921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-serving-cert\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238166 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.237981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzd6x\" (UniqueName: \"kubernetes.io/projected/90c54830-b3ba-4066-9794-55e29cf3e406-kube-api-access-dzd6x\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238166 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-service-ca\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238166 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238044 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-oauth-serving-cert\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238166 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-console-config\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238357 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238227 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-trusted-ca-bundle\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238357 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-oauth-config\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.238886 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-service-ca\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.239009 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238972 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-console-config\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.239009 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.238977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-oauth-serving-cert\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.239412 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.239390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-trusted-ca-bundle\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.240555 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.240538 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-serving-cert\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.240652 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.240636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-oauth-config\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.247688 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.247668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzd6x\" (UniqueName: \"kubernetes.io/projected/90c54830-b3ba-4066-9794-55e29cf3e406-kube-api-access-dzd6x\") pod \"console-9dc7d57f7-t2m9f\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.344773 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.344687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:23.462397 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.462371 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9dc7d57f7-t2m9f"]
Apr 16 19:56:23.464732 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:56:23.464700 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90c54830_b3ba_4066_9794_55e29cf3e406.slice/crio-163381fc96acc50c1d70766a7ebaf802678d36795afea5a969121f1a3e82113d WatchSource:0}: Error finding container 163381fc96acc50c1d70766a7ebaf802678d36795afea5a969121f1a3e82113d: Status 404 returned error can't find the container with id 163381fc96acc50c1d70766a7ebaf802678d36795afea5a969121f1a3e82113d
Apr 16 19:56:23.795573 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.795530 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9dc7d57f7-t2m9f" event={"ID":"90c54830-b3ba-4066-9794-55e29cf3e406","Type":"ContainerStarted","Data":"a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea"}
Apr 16 19:56:23.795573 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.795571 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9dc7d57f7-t2m9f" event={"ID":"90c54830-b3ba-4066-9794-55e29cf3e406","Type":"ContainerStarted","Data":"163381fc96acc50c1d70766a7ebaf802678d36795afea5a969121f1a3e82113d"}
Apr 16 19:56:23.814358 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:23.814305 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9dc7d57f7-t2m9f" podStartSLOduration=0.814292345 podStartE2EDuration="814.292345ms" podCreationTimestamp="2026-04-16 19:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:23.81190957 +0000 UTC m=+147.346775607" watchObservedRunningTime="2026-04-16 19:56:23.814292345 +0000 UTC m=+147.349158381"
Apr 16 19:56:33.345806 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:33.345766 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:33.345806 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:33.345808 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:33.350745 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:33.350718 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:33.828693 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:33.828665 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9dc7d57f7-t2m9f"
Apr 16 19:56:33.877300 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:33.877268 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8595f7f78f-8hnwp"]
Apr 16 19:56:58.897040 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:58.896981 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8595f7f78f-8hnwp" podUID="2b015b01-9339-426c-9bfa-195d80ec1a92" containerName="console" containerID="cri-o://c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f" gracePeriod=15
Apr 16 19:56:59.134929 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.134907 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8595f7f78f-8hnwp_2b015b01-9339-426c-9bfa-195d80ec1a92/console/0.log"
Apr 16 19:56:59.135050 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.134969 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8595f7f78f-8hnwp"
Apr 16 19:56:59.244018 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.243930 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-service-ca\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244018 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.243984 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-oauth-config\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244018 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244011 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-serving-cert\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244316 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244144 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-trusted-ca-bundle\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244316 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244200 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnz6f\" (UniqueName: \"kubernetes.io/projected/2b015b01-9339-426c-9bfa-195d80ec1a92-kube-api-access-qnz6f\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244316 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244253 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-oauth-serving-cert\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244316 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244284 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-console-config\") pod \"2b015b01-9339-426c-9bfa-195d80ec1a92\" (UID: \"2b015b01-9339-426c-9bfa-195d80ec1a92\") "
Apr 16 19:56:59.244582 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244440 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-service-ca" (OuterVolumeSpecName: "service-ca") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:59.244657 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244589 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:59.244657 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244599 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:59.244806 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.244784 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-console-config" (OuterVolumeSpecName: "console-config") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:59.246327 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.246303 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b015b01-9339-426c-9bfa-195d80ec1a92-kube-api-access-qnz6f" (OuterVolumeSpecName: "kube-api-access-qnz6f") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "kube-api-access-qnz6f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:56:59.246583 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.246566 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:59.246627 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.246587 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2b015b01-9339-426c-9bfa-195d80ec1a92" (UID: "2b015b01-9339-426c-9bfa-195d80ec1a92"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:59.345323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345286 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.345323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345316 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-trusted-ca-bundle\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.345323 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345329 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qnz6f\" (UniqueName: \"kubernetes.io/projected/2b015b01-9339-426c-9bfa-195d80ec1a92-kube-api-access-qnz6f\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.345546 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345342 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-oauth-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.345546 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345354 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-console-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.345546 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345366 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b015b01-9339-426c-9bfa-195d80ec1a92-service-ca\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.345546 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.345377 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b015b01-9339-426c-9bfa-195d80ec1a92-console-oauth-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 19:56:59.899075 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.899037 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8595f7f78f-8hnwp_2b015b01-9339-426c-9bfa-195d80ec1a92/console/0.log"
Apr 16 19:56:59.899471 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.899103 2570 generic.go:358] "Generic (PLEG): container finished" podID="2b015b01-9339-426c-9bfa-195d80ec1a92" containerID="c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f" exitCode=2
Apr 16 19:56:59.899471 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.899194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8595f7f78f-8hnwp" event={"ID":"2b015b01-9339-426c-9bfa-195d80ec1a92","Type":"ContainerDied","Data":"c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f"}
Apr 16 19:56:59.899471 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.899234 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8595f7f78f-8hnwp" event={"ID":"2b015b01-9339-426c-9bfa-195d80ec1a92","Type":"ContainerDied","Data":"63693ca9f7bea7f90c4a346bd9ff8b685902fdec2bc39f7d882556459d298af1"}
Apr 16 19:56:59.899471 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.899250 2570 scope.go:117] "RemoveContainer" containerID="c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f"
Apr 16 19:56:59.899471 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.899203 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8595f7f78f-8hnwp"
Apr 16 19:56:59.907294 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.907272 2570 scope.go:117] "RemoveContainer" containerID="c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f"
Apr 16 19:56:59.907564 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:56:59.907545 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f\": container with ID starting with c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f not found: ID does not exist" containerID="c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f"
Apr 16 19:56:59.907630 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.907577 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f"} err="failed to get container status \"c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f\": rpc error: code = NotFound desc = could not find container \"c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f\": container with ID starting with c0c3240d5a3028bcb426ea21f8b7329215040570186d95564543a3e18a150e2f not found: ID does not exist"
Apr 16 19:56:59.921069 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.921031 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8595f7f78f-8hnwp"]
Apr 16 19:56:59.924424 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:56:59.924404 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8595f7f78f-8hnwp"]
Apr 16 19:57:01.110821 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:01.110786 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b015b01-9339-426c-9bfa-195d80ec1a92" path="/var/lib/kubelet/pods/2b015b01-9339-426c-9bfa-195d80ec1a92/volumes"
Apr 16 19:57:41.744793 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.744762 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7ddcc767f5-74npf"]
Apr 16 19:57:41.745229 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.745119 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b015b01-9339-426c-9bfa-195d80ec1a92" containerName="console"
Apr 16 19:57:41.745229 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.745133 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b015b01-9339-426c-9bfa-195d80ec1a92" containerName="console"
Apr 16 19:57:41.745229 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.745184 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b015b01-9339-426c-9bfa-195d80ec1a92" containerName="console"
Apr 16 19:57:41.748449 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.748429 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.758480 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.758456 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ddcc767f5-74npf"]
Apr 16 19:57:41.792188 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792148 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556z2\" (UniqueName: \"kubernetes.io/projected/7a9895a8-edbc-434f-9da4-900c567a1fa1-kube-api-access-556z2\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.792350 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792209 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-trusted-ca-bundle\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.792350 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792235 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-config\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.792350 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792266 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-oauth-config\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.792350 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-oauth-serving-cert\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.792503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-service-ca\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.792503 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.792390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-serving-cert\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.892800 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.892769 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-556z2\" (UniqueName: \"kubernetes.io/projected/7a9895a8-edbc-434f-9da4-900c567a1fa1-kube-api-access-556z2\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893007 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.892819 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-trusted-ca-bundle\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893007 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.892942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-config\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893007 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.892992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-oauth-config\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893140 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893040 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-oauth-serving-cert\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893221 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-service-ca\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893271 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-serving-cert\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893679 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-config\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893787 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893725 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-trusted-ca-bundle\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893787 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-oauth-serving-cert\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.893871 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.893785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-service-ca\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.895455 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.895431 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-oauth-config\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.895593 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.895561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-serving-cert\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:41.900806 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:41.900785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-556z2\" (UniqueName: \"kubernetes.io/projected/7a9895a8-edbc-434f-9da4-900c567a1fa1-kube-api-access-556z2\") pod \"console-7ddcc767f5-74npf\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") " pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:42.058034 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:42.058006 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 19:57:42.176264 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:42.176227 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ddcc767f5-74npf"]
Apr 16 19:57:42.180094 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:57:42.180049 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9895a8_edbc_434f_9da4_900c567a1fa1.slice/crio-01ef11ab0b6ef29f57c437252a4e77ee08deda455c5f858b3c010839fa6904b5 WatchSource:0}: Error finding container 01ef11ab0b6ef29f57c437252a4e77ee08deda455c5f858b3c010839fa6904b5: Status 404 returned error can't find the container with id 01ef11ab0b6ef29f57c437252a4e77ee08deda455c5f858b3c010839fa6904b5
Apr 16 19:57:43.023950 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:43.023912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ddcc767f5-74npf" event={"ID":"7a9895a8-edbc-434f-9da4-900c567a1fa1","Type":"ContainerStarted","Data":"c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb"}
Apr 16 19:57:43.023950 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:43.023952 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ddcc767f5-74npf" event={"ID":"7a9895a8-edbc-434f-9da4-900c567a1fa1","Type":"ContainerStarted","Data":"01ef11ab0b6ef29f57c437252a4e77ee08deda455c5f858b3c010839fa6904b5"}
Apr 16 19:57:43.042502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:43.042452 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7ddcc767f5-74npf" podStartSLOduration=2.042436722 podStartE2EDuration="2.042436722s" podCreationTimestamp="2026-04-16 19:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:57:43.040447291 +0000 UTC m=+226.575313348"
watchObservedRunningTime="2026-04-16 19:57:43.042436722 +0000 UTC m=+226.577302758" Apr 16 19:57:52.058130 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:52.058097 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7ddcc767f5-74npf" Apr 16 19:57:52.058130 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:52.058137 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7ddcc767f5-74npf" Apr 16 19:57:52.062959 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:52.062934 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7ddcc767f5-74npf" Apr 16 19:57:53.057457 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:53.057431 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7ddcc767f5-74npf" Apr 16 19:57:53.114302 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:57:53.114270 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9dc7d57f7-t2m9f"] Apr 16 19:58:02.942957 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:02.942925 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pcrsp"] Apr 16 19:58:02.947320 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:02.947300 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:02.949815 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:02.949798 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:58:02.953729 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:02.953697 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pcrsp"] Apr 16 19:58:03.065324 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.065294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b2defe9-7ceb-4351-8846-5d2d737476d8-dbus\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.065478 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.065352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b2defe9-7ceb-4351-8846-5d2d737476d8-kubelet-config\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.065478 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.065383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2defe9-7ceb-4351-8846-5d2d737476d8-original-pull-secret\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.166314 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.166276 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/8b2defe9-7ceb-4351-8846-5d2d737476d8-kubelet-config\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.166502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.166319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2defe9-7ceb-4351-8846-5d2d737476d8-original-pull-secret\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.166502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.166402 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b2defe9-7ceb-4351-8846-5d2d737476d8-dbus\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.166502 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.166411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b2defe9-7ceb-4351-8846-5d2d737476d8-kubelet-config\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.166656 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.166601 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b2defe9-7ceb-4351-8846-5d2d737476d8-dbus\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.168564 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.168546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2defe9-7ceb-4351-8846-5d2d737476d8-original-pull-secret\") pod \"global-pull-secret-syncer-pcrsp\" (UID: \"8b2defe9-7ceb-4351-8846-5d2d737476d8\") " pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.256975 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.256945 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pcrsp" Apr 16 19:58:03.375171 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:03.375002 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pcrsp"] Apr 16 19:58:03.377825 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:58:03.377795 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2defe9_7ceb_4351_8846_5d2d737476d8.slice/crio-d73f99d3d4823a42a2855e6ac261ac7621279bfeb3480b549e414978c07c1986 WatchSource:0}: Error finding container d73f99d3d4823a42a2855e6ac261ac7621279bfeb3480b549e414978c07c1986: Status 404 returned error can't find the container with id d73f99d3d4823a42a2855e6ac261ac7621279bfeb3480b549e414978c07c1986 Apr 16 19:58:04.085715 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:04.085673 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pcrsp" event={"ID":"8b2defe9-7ceb-4351-8846-5d2d737476d8","Type":"ContainerStarted","Data":"d73f99d3d4823a42a2855e6ac261ac7621279bfeb3480b549e414978c07c1986"} Apr 16 19:58:08.104501 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:08.104464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pcrsp" event={"ID":"8b2defe9-7ceb-4351-8846-5d2d737476d8","Type":"ContainerStarted","Data":"37c858f05fdc8194ed06dd441f59149bc2061c54e11c5c78a4a54ffde6efe6fa"} Apr 16 19:58:08.121273 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:08.121225 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pcrsp" podStartSLOduration=2.486356605 podStartE2EDuration="6.121209499s" podCreationTimestamp="2026-04-16 19:58:02 +0000 UTC" firstStartedPulling="2026-04-16 19:58:03.379427141 +0000 UTC m=+246.914293154" lastFinishedPulling="2026-04-16 19:58:07.014280034 +0000 UTC m=+250.549146048" observedRunningTime="2026-04-16 19:58:08.119379589 +0000 UTC m=+251.654245626" watchObservedRunningTime="2026-04-16 19:58:08.121209499 +0000 UTC m=+251.656075536" Apr 16 19:58:18.133804 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.133722 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9dc7d57f7-t2m9f" podUID="90c54830-b3ba-4066-9794-55e29cf3e406" containerName="console" containerID="cri-o://a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea" gracePeriod=15 Apr 16 19:58:18.375981 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.375957 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9dc7d57f7-t2m9f_90c54830-b3ba-4066-9794-55e29cf3e406/console/0.log" Apr 16 19:58:18.376113 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.376040 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9dc7d57f7-t2m9f" Apr 16 19:58:18.504034 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.503987 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-service-ca\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504247 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504120 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-oauth-config\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504247 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504152 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-trusted-ca-bundle\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504247 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504179 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-oauth-serving-cert\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504247 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504204 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-serving-cert\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504439 ip-10-0-129-34 
kubenswrapper[2570]: I0416 19:58:18.504261 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzd6x\" (UniqueName: \"kubernetes.io/projected/90c54830-b3ba-4066-9794-55e29cf3e406-kube-api-access-dzd6x\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504439 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504294 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-console-config\") pod \"90c54830-b3ba-4066-9794-55e29cf3e406\" (UID: \"90c54830-b3ba-4066-9794-55e29cf3e406\") " Apr 16 19:58:18.504550 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504516 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-service-ca" (OuterVolumeSpecName: "service-ca") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:18.504608 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504567 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:18.504665 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504616 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:18.504874 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.504845 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-console-config" (OuterVolumeSpecName: "console-config") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:18.506540 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.506510 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:58:18.506631 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.506554 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c54830-b3ba-4066-9794-55e29cf3e406-kube-api-access-dzd6x" (OuterVolumeSpecName: "kube-api-access-dzd6x") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "kube-api-access-dzd6x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:58:18.506631 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.506554 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "90c54830-b3ba-4066-9794-55e29cf3e406" (UID: "90c54830-b3ba-4066-9794-55e29cf3e406"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:58:18.605437 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605399 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzd6x\" (UniqueName: \"kubernetes.io/projected/90c54830-b3ba-4066-9794-55e29cf3e406-kube-api-access-dzd6x\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:18.605437 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605432 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-console-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:18.605437 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605442 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-service-ca\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:18.605661 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605452 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-oauth-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:18.605661 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605461 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-trusted-ca-bundle\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:18.605661 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605471 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90c54830-b3ba-4066-9794-55e29cf3e406-oauth-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:18.605661 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:18.605481 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90c54830-b3ba-4066-9794-55e29cf3e406-console-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:19.135660 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.135637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9dc7d57f7-t2m9f_90c54830-b3ba-4066-9794-55e29cf3e406/console/0.log" Apr 16 19:58:19.136118 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.135682 2570 generic.go:358] "Generic (PLEG): container finished" podID="90c54830-b3ba-4066-9794-55e29cf3e406" containerID="a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea" exitCode=2 Apr 16 19:58:19.136118 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.135785 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9dc7d57f7-t2m9f" Apr 16 19:58:19.136118 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.135784 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9dc7d57f7-t2m9f" event={"ID":"90c54830-b3ba-4066-9794-55e29cf3e406","Type":"ContainerDied","Data":"a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea"} Apr 16 19:58:19.136118 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.135835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9dc7d57f7-t2m9f" event={"ID":"90c54830-b3ba-4066-9794-55e29cf3e406","Type":"ContainerDied","Data":"163381fc96acc50c1d70766a7ebaf802678d36795afea5a969121f1a3e82113d"} Apr 16 19:58:19.136118 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.135856 2570 scope.go:117] "RemoveContainer" containerID="a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea" Apr 16 19:58:19.143638 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.143618 2570 scope.go:117] "RemoveContainer" containerID="a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea" Apr 16 19:58:19.143912 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:19.143890 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea\": container with ID starting with a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea not found: ID does not exist" containerID="a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea" Apr 16 19:58:19.144066 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.143919 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea"} err="failed to get container status \"a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea\": rpc error: code = NotFound 
desc = could not find container \"a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea\": container with ID starting with a74b05e986d79f680e5fad1aac7f1a0302e213ece9e04518b6de8ccf0eb501ea not found: ID does not exist" Apr 16 19:58:19.167130 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.167100 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9dc7d57f7-t2m9f"] Apr 16 19:58:19.171717 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:19.171692 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9dc7d57f7-t2m9f"] Apr 16 19:58:21.111627 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.111593 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c54830-b3ba-4066-9794-55e29cf3e406" path="/var/lib/kubelet/pods/90c54830-b3ba-4066-9794-55e29cf3e406/volumes" Apr 16 19:58:21.253866 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.253833 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22"] Apr 16 19:58:21.254171 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.254159 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c54830-b3ba-4066-9794-55e29cf3e406" containerName="console" Apr 16 19:58:21.254221 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.254172 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c54830-b3ba-4066-9794-55e29cf3e406" containerName="console" Apr 16 19:58:21.254255 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.254245 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c54830-b3ba-4066-9794-55e29cf3e406" containerName="console" Apr 16 19:58:21.258485 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.258464 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.261154 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.261133 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:58:21.261264 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.261253 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:58:21.262347 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.262333 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-j7bkk\"" Apr 16 19:58:21.266029 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.265716 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22"] Apr 16 19:58:21.427209 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.427109 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.427209 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.427174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tkn\" (UniqueName: \"kubernetes.io/projected/02c332bb-1b23-4ff1-9596-35db4adf1c64-kube-api-access-z2tkn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.427392 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.427288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.528036 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.527996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2tkn\" (UniqueName: \"kubernetes.io/projected/02c332bb-1b23-4ff1-9596-35db4adf1c64-kube-api-access-z2tkn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.528215 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.528108 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.528215 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.528160 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.528549 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.528525 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.528632 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.528549 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.536311 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.536278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2tkn\" (UniqueName: \"kubernetes.io/projected/02c332bb-1b23-4ff1-9596-35db4adf1c64-kube-api-access-z2tkn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.568264 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.568226 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:21.688371 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:21.688297 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22"] Apr 16 19:58:21.691262 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:58:21.691229 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c332bb_1b23_4ff1_9596_35db4adf1c64.slice/crio-3b00aa331b549b2d42e1b69ba72de06596800299703b05679df541a54232e4ab WatchSource:0}: Error finding container 3b00aa331b549b2d42e1b69ba72de06596800299703b05679df541a54232e4ab: Status 404 returned error can't find the container with id 3b00aa331b549b2d42e1b69ba72de06596800299703b05679df541a54232e4ab Apr 16 19:58:22.147441 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:22.147404 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" event={"ID":"02c332bb-1b23-4ff1-9596-35db4adf1c64","Type":"ContainerStarted","Data":"3b00aa331b549b2d42e1b69ba72de06596800299703b05679df541a54232e4ab"} Apr 16 19:58:27.165326 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:27.165291 2570 generic.go:358] "Generic (PLEG): container finished" podID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerID="ee9a250b274e89b87d2b3d7e0bfe9db03fc9fc0e15a8fe17dea7e9a2425b19df" exitCode=0 Apr 16 19:58:27.165741 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:27.165343 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" event={"ID":"02c332bb-1b23-4ff1-9596-35db4adf1c64","Type":"ContainerDied","Data":"ee9a250b274e89b87d2b3d7e0bfe9db03fc9fc0e15a8fe17dea7e9a2425b19df"} Apr 16 19:58:30.175258 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:58:30.175212 2570 generic.go:358] "Generic (PLEG): container finished" podID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerID="5950db5b117c7e50e7fee2f3b305765f7b269ddb9206994f3c1b0c793fdf792e" exitCode=0 Apr 16 19:58:30.175621 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:30.175295 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" event={"ID":"02c332bb-1b23-4ff1-9596-35db4adf1c64","Type":"ContainerDied","Data":"5950db5b117c7e50e7fee2f3b305765f7b269ddb9206994f3c1b0c793fdf792e"} Apr 16 19:58:37.202921 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:37.202884 2570 generic.go:358] "Generic (PLEG): container finished" podID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerID="9a699d92122fc0a637c780d21fc9bb6e12793c61bfed02608e80b4a41fcac8a6" exitCode=0 Apr 16 19:58:37.203351 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:37.202966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" event={"ID":"02c332bb-1b23-4ff1-9596-35db4adf1c64","Type":"ContainerDied","Data":"9a699d92122fc0a637c780d21fc9bb6e12793c61bfed02608e80b4a41fcac8a6"} Apr 16 19:58:38.325850 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.325828 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:38.368505 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.368446 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2tkn\" (UniqueName: \"kubernetes.io/projected/02c332bb-1b23-4ff1-9596-35db4adf1c64-kube-api-access-z2tkn\") pod \"02c332bb-1b23-4ff1-9596-35db4adf1c64\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " Apr 16 19:58:38.368505 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.368486 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-util\") pod \"02c332bb-1b23-4ff1-9596-35db4adf1c64\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " Apr 16 19:58:38.368749 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.368526 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-bundle\") pod \"02c332bb-1b23-4ff1-9596-35db4adf1c64\" (UID: \"02c332bb-1b23-4ff1-9596-35db4adf1c64\") " Apr 16 19:58:38.369170 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.369140 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-bundle" (OuterVolumeSpecName: "bundle") pod "02c332bb-1b23-4ff1-9596-35db4adf1c64" (UID: "02c332bb-1b23-4ff1-9596-35db4adf1c64"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:58:38.371849 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.371819 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c332bb-1b23-4ff1-9596-35db4adf1c64-kube-api-access-z2tkn" (OuterVolumeSpecName: "kube-api-access-z2tkn") pod "02c332bb-1b23-4ff1-9596-35db4adf1c64" (UID: "02c332bb-1b23-4ff1-9596-35db4adf1c64"). InnerVolumeSpecName "kube-api-access-z2tkn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:58:38.373394 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.373373 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-util" (OuterVolumeSpecName: "util") pod "02c332bb-1b23-4ff1-9596-35db4adf1c64" (UID: "02c332bb-1b23-4ff1-9596-35db4adf1c64"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:58:38.469976 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.469889 2570 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-bundle\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:38.469976 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.469918 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2tkn\" (UniqueName: \"kubernetes.io/projected/02c332bb-1b23-4ff1-9596-35db4adf1c64-kube-api-access-z2tkn\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:38.469976 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:38.469930 2570 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02c332bb-1b23-4ff1-9596-35db4adf1c64-util\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 19:58:39.210820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:39.210787 2570 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" event={"ID":"02c332bb-1b23-4ff1-9596-35db4adf1c64","Type":"ContainerDied","Data":"3b00aa331b549b2d42e1b69ba72de06596800299703b05679df541a54232e4ab"} Apr 16 19:58:39.210820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:39.210821 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b00aa331b549b2d42e1b69ba72de06596800299703b05679df541a54232e4ab" Apr 16 19:58:39.210820 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:39.210797 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwzd22" Apr 16 19:58:43.323594 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323546 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql"] Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323880 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="util" Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323891 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="util" Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323906 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="pull" Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323911 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="pull" Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323919 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="extract" Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323925 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="extract" Apr 16 19:58:43.324080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.323985 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="02c332bb-1b23-4ff1-9596-35db4adf1c64" containerName="extract" Apr 16 19:58:43.377903 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.377866 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql"] Apr 16 19:58:43.378076 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.378008 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.380816 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.380781 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-rqzsv\"" Apr 16 19:58:43.380816 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.380781 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 19:58:43.381012 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.380819 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 19:58:43.381012 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.380799 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 19:58:43.405958 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.405920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/46a389b7-c70f-4215-8a5e-b855b0e29869-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql\" (UID: \"46a389b7-c70f-4215-8a5e-b855b0e29869\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.406181 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.406026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvmg\" (UniqueName: \"kubernetes.io/projected/46a389b7-c70f-4215-8a5e-b855b0e29869-kube-api-access-spvmg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql\" (UID: \"46a389b7-c70f-4215-8a5e-b855b0e29869\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.506666 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.506620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spvmg\" (UniqueName: \"kubernetes.io/projected/46a389b7-c70f-4215-8a5e-b855b0e29869-kube-api-access-spvmg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql\" (UID: \"46a389b7-c70f-4215-8a5e-b855b0e29869\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.506868 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.506717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/46a389b7-c70f-4215-8a5e-b855b0e29869-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql\" (UID: \"46a389b7-c70f-4215-8a5e-b855b0e29869\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.509094 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.509040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/46a389b7-c70f-4215-8a5e-b855b0e29869-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql\" (UID: 
\"46a389b7-c70f-4215-8a5e-b855b0e29869\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.516035 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.515999 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvmg\" (UniqueName: \"kubernetes.io/projected/46a389b7-c70f-4215-8a5e-b855b0e29869-kube-api-access-spvmg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql\" (UID: \"46a389b7-c70f-4215-8a5e-b855b0e29869\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.688177 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.688080 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:43.813969 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:43.813940 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql"] Apr 16 19:58:43.816492 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:58:43.816461 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46a389b7_c70f_4215_8a5e_b855b0e29869.slice/crio-00c8923485f649ceee8310d1bd2f377d4cb73848defbb3c3bc59514d8b13b137 WatchSource:0}: Error finding container 00c8923485f649ceee8310d1bd2f377d4cb73848defbb3c3bc59514d8b13b137: Status 404 returned error can't find the container with id 00c8923485f649ceee8310d1bd2f377d4cb73848defbb3c3bc59514d8b13b137 Apr 16 19:58:44.227133 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:44.227093 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" event={"ID":"46a389b7-c70f-4215-8a5e-b855b0e29869","Type":"ContainerStarted","Data":"00c8923485f649ceee8310d1bd2f377d4cb73848defbb3c3bc59514d8b13b137"} Apr 16 19:58:47.240698 ip-10-0-129-34 kubenswrapper[2570]: I0416 
19:58:47.240604 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" event={"ID":"46a389b7-c70f-4215-8a5e-b855b0e29869","Type":"ContainerStarted","Data":"b424346dc98393e49e1ac809ca3062acc654dfe518406cd1e47818d265ebb43d"} Apr 16 19:58:47.240698 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.240659 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" Apr 16 19:58:47.486679 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.486620 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql" podStartSLOduration=1.5535187430000001 podStartE2EDuration="4.486598669s" podCreationTimestamp="2026-04-16 19:58:43 +0000 UTC" firstStartedPulling="2026-04-16 19:58:43.818231374 +0000 UTC m=+287.353097388" lastFinishedPulling="2026-04-16 19:58:46.751311299 +0000 UTC m=+290.286177314" observedRunningTime="2026-04-16 19:58:47.290917564 +0000 UTC m=+290.825783599" watchObservedRunningTime="2026-04-16 19:58:47.486598669 +0000 UTC m=+291.021464705" Apr 16 19:58:47.486865 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.486777 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-khbhz"] Apr 16 19:58:47.490252 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.490233 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.494246 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.494189 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 19:58:47.494404 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.494389 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 19:58:47.494521 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.494501 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5mtwl\"" Apr 16 19:58:47.499414 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.499390 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-khbhz"] Apr 16 19:58:47.543383 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.543346 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6xf\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-kube-api-access-vf6xf\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.543556 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.543393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/aeb41d46-8dcd-43d9-acab-5b2057669c70-cabundle0\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.543556 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.543512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.643902 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.643860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.644141 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.643920 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6xf\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-kube-api-access-vf6xf\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.644141 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.643961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/aeb41d46-8dcd-43d9-acab-5b2057669c70-cabundle0\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.644141 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.644042 2570 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 19:58:47.644141 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.644081 2570 secret.go:281] references non-existent secret key: ca.crt Apr 16 19:58:47.644141 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.644093 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret 
key: ca.crt Apr 16 19:58:47.644141 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.644110 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-khbhz: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 19:58:47.644459 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.644172 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates podName:aeb41d46-8dcd-43d9-acab-5b2057669c70 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:48.14415161 +0000 UTC m=+291.679017637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates") pod "keda-operator-ffbb595cb-khbhz" (UID: "aeb41d46-8dcd-43d9-acab-5b2057669c70") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 19:58:47.644710 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.644686 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/aeb41d46-8dcd-43d9-acab-5b2057669c70-cabundle0\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.658530 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.658497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6xf\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-kube-api-access-vf6xf\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:47.718458 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.718419 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"] Apr 16 19:58:47.721981 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.721962 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.724826 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.724803 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 19:58:47.732120 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.732090 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"] Apr 16 19:58:47.744814 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.744738 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.744814 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.744774 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13591437-6ce6-4491-8de7-8de01039e09a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.745037 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.744861 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzr7\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-kube-api-access-sjzr7\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.845380 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.845334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.845380 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.845383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13591437-6ce6-4491-8de7-8de01039e09a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.845642 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.845452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzr7\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-kube-api-access-sjzr7\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.845642 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.845498 2570 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:58:47.845642 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.845519 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:58:47.845642 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.845541 2570 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 19:58:47.845642 ip-10-0-129-34 kubenswrapper[2570]: E0416 
19:58:47.845564 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 19:58:47.845949 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:47.845660 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates podName:13591437-6ce6-4491-8de7-8de01039e09a nodeName:}" failed. No retries permitted until 2026-04-16 19:58:48.345639658 +0000 UTC m=+291.880505681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates") pod "keda-metrics-apiserver-7c9f485588-r8h6c" (UID: "13591437-6ce6-4491-8de7-8de01039e09a") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 19:58:47.845949 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.845816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13591437-6ce6-4491-8de7-8de01039e09a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:47.855184 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:47.855159 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzr7\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-kube-api-access-sjzr7\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" Apr 16 19:58:48.036970 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.036928 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-rdjzq"] Apr 16 19:58:48.040772 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.040748 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-rdjzq" Apr 16 19:58:48.043448 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.043426 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 19:58:48.051343 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.051300 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-rdjzq"] Apr 16 19:58:48.148351 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.148301 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wxc8\" (UniqueName: \"kubernetes.io/projected/d8361fdd-c337-4c48-ab41-adfcb39ac9ed-kube-api-access-9wxc8\") pod \"keda-admission-cf49989db-rdjzq\" (UID: \"d8361fdd-c337-4c48-ab41-adfcb39ac9ed\") " pod="openshift-keda/keda-admission-cf49989db-rdjzq" Apr 16 19:58:48.148553 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.148412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz" Apr 16 19:58:48.148553 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.148505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d8361fdd-c337-4c48-ab41-adfcb39ac9ed-certificates\") pod \"keda-admission-cf49989db-rdjzq\" (UID: \"d8361fdd-c337-4c48-ab41-adfcb39ac9ed\") " pod="openshift-keda/keda-admission-cf49989db-rdjzq" Apr 16 19:58:48.148786 ip-10-0-129-34 
kubenswrapper[2570]: E0416 19:58:48.148759 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:58:48.148867 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.148790 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:58:48.148867 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.148802 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-khbhz: references non-existent secret key: ca.crt
Apr 16 19:58:48.148867 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.148852 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates podName:aeb41d46-8dcd-43d9-acab-5b2057669c70 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:49.148834367 +0000 UTC m=+292.683700398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates") pod "keda-operator-ffbb595cb-khbhz" (UID: "aeb41d46-8dcd-43d9-acab-5b2057669c70") : references non-existent secret key: ca.crt
Apr 16 19:58:48.249252 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.249219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d8361fdd-c337-4c48-ab41-adfcb39ac9ed-certificates\") pod \"keda-admission-cf49989db-rdjzq\" (UID: \"d8361fdd-c337-4c48-ab41-adfcb39ac9ed\") " pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:58:48.249720 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.249498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wxc8\" (UniqueName: \"kubernetes.io/projected/d8361fdd-c337-4c48-ab41-adfcb39ac9ed-kube-api-access-9wxc8\") pod \"keda-admission-cf49989db-rdjzq\" (UID: \"d8361fdd-c337-4c48-ab41-adfcb39ac9ed\") " pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:58:48.251923 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.251895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d8361fdd-c337-4c48-ab41-adfcb39ac9ed-certificates\") pod \"keda-admission-cf49989db-rdjzq\" (UID: \"d8361fdd-c337-4c48-ab41-adfcb39ac9ed\") " pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:58:48.298016 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.297934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wxc8\" (UniqueName: \"kubernetes.io/projected/d8361fdd-c337-4c48-ab41-adfcb39ac9ed-kube-api-access-9wxc8\") pod \"keda-admission-cf49989db-rdjzq\" (UID: \"d8361fdd-c337-4c48-ab41-adfcb39ac9ed\") " pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:58:48.350406 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.350360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:58:48.350579 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.350544 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 19:58:48.350579 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.350570 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 19:58:48.350656 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.350593 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c: references non-existent secret key: tls.crt
Apr 16 19:58:48.350656 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:48.350654 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates podName:13591437-6ce6-4491-8de7-8de01039e09a nodeName:}" failed. No retries permitted until 2026-04-16 19:58:49.350635145 +0000 UTC m=+292.885501164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates") pod "keda-metrics-apiserver-7c9f485588-r8h6c" (UID: "13591437-6ce6-4491-8de7-8de01039e09a") : references non-existent secret key: tls.crt
Apr 16 19:58:48.354080 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.354043 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:58:48.508922 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:48.508881 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-rdjzq"]
Apr 16 19:58:48.512284 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:58:48.512240 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8361fdd_c337_4c48_ab41_adfcb39ac9ed.slice/crio-a991b64db88274584ca20d9d2a20ddcc429aa3cab8d37a225ebb45f9b91a0e25 WatchSource:0}: Error finding container a991b64db88274584ca20d9d2a20ddcc429aa3cab8d37a225ebb45f9b91a0e25: Status 404 returned error can't find the container with id a991b64db88274584ca20d9d2a20ddcc429aa3cab8d37a225ebb45f9b91a0e25
Apr 16 19:58:49.159097 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:49.159045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:58:49.159315 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.159228 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:58:49.159315 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.159248 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:58:49.159315 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.159259 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-khbhz: references non-existent secret key: ca.crt
Apr 16 19:58:49.159487 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.159324 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates podName:aeb41d46-8dcd-43d9-acab-5b2057669c70 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:51.159306641 +0000 UTC m=+294.694172655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates") pod "keda-operator-ffbb595cb-khbhz" (UID: "aeb41d46-8dcd-43d9-acab-5b2057669c70") : references non-existent secret key: ca.crt
Apr 16 19:58:49.249307 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:49.249271 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-rdjzq" event={"ID":"d8361fdd-c337-4c48-ab41-adfcb39ac9ed","Type":"ContainerStarted","Data":"a991b64db88274584ca20d9d2a20ddcc429aa3cab8d37a225ebb45f9b91a0e25"}
Apr 16 19:58:49.360483 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:49.360439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:58:49.360666 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.360603 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 19:58:49.360666 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.360621 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 19:58:49.360666 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.360644 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c: references non-existent secret key: tls.crt
Apr 16 19:58:49.360781 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:49.360706 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates podName:13591437-6ce6-4491-8de7-8de01039e09a nodeName:}" failed. No retries permitted until 2026-04-16 19:58:51.360691314 +0000 UTC m=+294.895557329 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates") pod "keda-metrics-apiserver-7c9f485588-r8h6c" (UID: "13591437-6ce6-4491-8de7-8de01039e09a") : references non-existent secret key: tls.crt
Apr 16 19:58:50.253965 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:50.253929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-rdjzq" event={"ID":"d8361fdd-c337-4c48-ab41-adfcb39ac9ed","Type":"ContainerStarted","Data":"464d61dd92b4c6971ca568184f09bc6bdaece1aa99c1a71a94eb9a6351c96d29"}
Apr 16 19:58:50.254358 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:50.254050 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:58:50.289563 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:50.289464 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-rdjzq" podStartSLOduration=0.779504313 podStartE2EDuration="2.289448076s" podCreationTimestamp="2026-04-16 19:58:48 +0000 UTC" firstStartedPulling="2026-04-16 19:58:48.515083759 +0000 UTC m=+292.049949774" lastFinishedPulling="2026-04-16 19:58:50.02502752 +0000 UTC m=+293.559893537" observedRunningTime="2026-04-16 19:58:50.287638537 +0000 UTC m=+293.822504584" watchObservedRunningTime="2026-04-16 19:58:50.289448076 +0000 UTC m=+293.824314112"
Apr 16 19:58:51.175782 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:51.175741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:58:51.175980 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.175895 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:58:51.175980 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.175921 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:58:51.175980 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.175933 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-khbhz: references non-existent secret key: ca.crt
Apr 16 19:58:51.176202 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.176013 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates podName:aeb41d46-8dcd-43d9-acab-5b2057669c70 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:55.175991835 +0000 UTC m=+298.710857851 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates") pod "keda-operator-ffbb595cb-khbhz" (UID: "aeb41d46-8dcd-43d9-acab-5b2057669c70") : references non-existent secret key: ca.crt
Apr 16 19:58:51.377553 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:51.377516 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:58:51.377914 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.377660 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 19:58:51.377914 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.377679 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 19:58:51.377914 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.377698 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c: references non-existent secret key: tls.crt
Apr 16 19:58:51.377914 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:58:51.377753 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates podName:13591437-6ce6-4491-8de7-8de01039e09a nodeName:}" failed. No retries permitted until 2026-04-16 19:58:55.377735837 +0000 UTC m=+298.912601878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates") pod "keda-metrics-apiserver-7c9f485588-r8h6c" (UID: "13591437-6ce6-4491-8de7-8de01039e09a") : references non-existent secret key: tls.crt
Apr 16 19:58:55.212637 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.212591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:58:55.214962 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.214936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aeb41d46-8dcd-43d9-acab-5b2057669c70-certificates\") pod \"keda-operator-ffbb595cb-khbhz\" (UID: \"aeb41d46-8dcd-43d9-acab-5b2057669c70\") " pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:58:55.300550 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.300489 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:58:55.414901 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.414866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:58:55.417266 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.417236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13591437-6ce6-4491-8de7-8de01039e09a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8h6c\" (UID: \"13591437-6ce6-4491-8de7-8de01039e09a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:58:55.425210 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.425189 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-khbhz"]
Apr 16 19:58:55.427985 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:58:55.427938 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeb41d46_8dcd_43d9_acab_5b2057669c70.slice/crio-51b9c62061a66d3b200672f2c735e9d1fa9920992a2f1af484b35100d80573bd WatchSource:0}: Error finding container 51b9c62061a66d3b200672f2c735e9d1fa9920992a2f1af484b35100d80573bd: Status 404 returned error can't find the container with id 51b9c62061a66d3b200672f2c735e9d1fa9920992a2f1af484b35100d80573bd
Apr 16 19:58:55.533475 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.533447 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:58:55.653429 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:55.653280 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"]
Apr 16 19:58:55.655876 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:58:55.655849 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13591437_6ce6_4491_8de7_8de01039e09a.slice/crio-1d1e0af858a207f6533d6d14ee2d6297e91d81bcf0544fb0755083728c72f2f1 WatchSource:0}: Error finding container 1d1e0af858a207f6533d6d14ee2d6297e91d81bcf0544fb0755083728c72f2f1: Status 404 returned error can't find the container with id 1d1e0af858a207f6533d6d14ee2d6297e91d81bcf0544fb0755083728c72f2f1
Apr 16 19:58:56.276262 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:56.276220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-khbhz" event={"ID":"aeb41d46-8dcd-43d9-acab-5b2057669c70","Type":"ContainerStarted","Data":"51b9c62061a66d3b200672f2c735e9d1fa9920992a2f1af484b35100d80573bd"}
Apr 16 19:58:56.277439 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:56.277407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" event={"ID":"13591437-6ce6-4491-8de7-8de01039e09a","Type":"ContainerStarted","Data":"1d1e0af858a207f6533d6d14ee2d6297e91d81bcf0544fb0755083728c72f2f1"}
Apr 16 19:58:56.992343 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:58:56.992320 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 19:59:00.292646 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:00.292605 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" event={"ID":"13591437-6ce6-4491-8de7-8de01039e09a","Type":"ContainerStarted","Data":"ed4bcfea45663e5758d9043ce5d2e78a0ce1b3ce1f8eeccbf860099c04efb5d0"}
Apr 16 19:59:00.292646 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:00.292651 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:59:00.293926 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:00.293892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-khbhz" event={"ID":"aeb41d46-8dcd-43d9-acab-5b2057669c70","Type":"ContainerStarted","Data":"06ea241d2b6c39f6d2987cebc3d2bca7d95d6b7209bb301914403b206123074a"}
Apr 16 19:59:00.294086 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:00.294048 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:59:00.310428 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:00.310377 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c" podStartSLOduration=9.646522236 podStartE2EDuration="13.310362177s" podCreationTimestamp="2026-04-16 19:58:47 +0000 UTC" firstStartedPulling="2026-04-16 19:58:55.657229216 +0000 UTC m=+299.192095234" lastFinishedPulling="2026-04-16 19:58:59.321069161 +0000 UTC m=+302.855935175" observedRunningTime="2026-04-16 19:59:00.309711791 +0000 UTC m=+303.844578030" watchObservedRunningTime="2026-04-16 19:59:00.310362177 +0000 UTC m=+303.845228213"
Apr 16 19:59:00.329608 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:00.329557 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-khbhz" podStartSLOduration=9.435860011 podStartE2EDuration="13.329543665s" podCreationTimestamp="2026-04-16 19:58:47 +0000 UTC" firstStartedPulling="2026-04-16 19:58:55.431639958 +0000 UTC m=+298.966505973" lastFinishedPulling="2026-04-16 19:58:59.325323608 +0000 UTC m=+302.860189627" observedRunningTime="2026-04-16 19:59:00.327341862 +0000 UTC m=+303.862207908" watchObservedRunningTime="2026-04-16 19:59:00.329543665 +0000 UTC m=+303.864409701"
Apr 16 19:59:08.247048 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:08.247016 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vs4ql"
Apr 16 19:59:11.260611 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:11.260582 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-rdjzq"
Apr 16 19:59:11.303374 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:11.303345 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8h6c"
Apr 16 19:59:21.300011 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:21.299965 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-khbhz"
Apr 16 19:59:56.828690 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.828608 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-wt7jq"]
Apr 16 19:59:56.832262 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.832241 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:56.838120 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.838031 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 19:59:56.839464 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.839442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 19:59:56.839582 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.839520 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-drflm\""
Apr 16 19:59:56.839644 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.839601 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 19:59:56.841211 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.841187 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"]
Apr 16 19:59:56.844378 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.844360 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:56.847074 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.847033 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 19:59:56.847074 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.847044 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dxp5x\""
Apr 16 19:59:56.849410 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.849386 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-wt7jq"]
Apr 16 19:59:56.860728 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.860702 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"]
Apr 16 19:59:56.922748 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.922716 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:56.922902 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.922767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8589cf1b-4575-44a0-92fb-65a3021168dd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xt5m8\" (UID: \"8589cf1b-4575-44a0-92fb-65a3021168dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:56.922902 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.922839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvx5\" (UniqueName: \"kubernetes.io/projected/8589cf1b-4575-44a0-92fb-65a3021168dd-kube-api-access-dfvx5\") pod \"llmisvc-controller-manager-68cc5db7c4-xt5m8\" (UID: \"8589cf1b-4575-44a0-92fb-65a3021168dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:56.922902 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:56.922878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crq4w\" (UniqueName: \"kubernetes.io/projected/a362a023-f232-465c-834a-95b238b1e655-kube-api-access-crq4w\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.023998 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.023970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8589cf1b-4575-44a0-92fb-65a3021168dd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xt5m8\" (UID: \"8589cf1b-4575-44a0-92fb-65a3021168dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:57.024190 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.024023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvx5\" (UniqueName: \"kubernetes.io/projected/8589cf1b-4575-44a0-92fb-65a3021168dd-kube-api-access-dfvx5\") pod \"llmisvc-controller-manager-68cc5db7c4-xt5m8\" (UID: \"8589cf1b-4575-44a0-92fb-65a3021168dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:57.024190 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.024069 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crq4w\" (UniqueName: \"kubernetes.io/projected/a362a023-f232-465c-834a-95b238b1e655-kube-api-access-crq4w\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.024190 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.024111 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.028196 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.028174 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 19:59:57.028351 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.028296 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 19:59:57.034593 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:59:57.034574 2570 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 16 19:59:57.034696 ip-10-0-129-34 kubenswrapper[2570]: E0416 19:59:57.034649 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert podName:a362a023-f232-465c-834a-95b238b1e655 nodeName:}" failed. No retries permitted until 2026-04-16 19:59:57.534625601 +0000 UTC m=+361.069491621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert") pod "kserve-controller-manager-659c8cbdc-wt7jq" (UID: "a362a023-f232-465c-834a-95b238b1e655") : secret "kserve-webhook-server-cert" not found
Apr 16 19:59:57.036995 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.036973 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8589cf1b-4575-44a0-92fb-65a3021168dd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xt5m8\" (UID: \"8589cf1b-4575-44a0-92fb-65a3021168dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:57.040545 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.040522 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 19:59:57.051587 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.051566 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 19:59:57.061668 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.061646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crq4w\" (UniqueName: \"kubernetes.io/projected/a362a023-f232-465c-834a-95b238b1e655-kube-api-access-crq4w\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.061775 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.061685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvx5\" (UniqueName: \"kubernetes.io/projected/8589cf1b-4575-44a0-92fb-65a3021168dd-kube-api-access-dfvx5\") pod \"llmisvc-controller-manager-68cc5db7c4-xt5m8\" (UID: \"8589cf1b-4575-44a0-92fb-65a3021168dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:57.157491 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.157415 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dxp5x\""
Apr 16 19:59:57.164707 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.164684 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 19:59:57.286018 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.285987 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"]
Apr 16 19:59:57.288402 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:59:57.288378 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8589cf1b_4575_44a0_92fb_65a3021168dd.slice/crio-79dbb38c042c2dad46bd635e6898d31590558ea0dced5f41732b7b96ab542deb WatchSource:0}: Error finding container 79dbb38c042c2dad46bd635e6898d31590558ea0dced5f41732b7b96ab542deb: Status 404 returned error can't find the container with id 79dbb38c042c2dad46bd635e6898d31590558ea0dced5f41732b7b96ab542deb
Apr 16 19:59:57.289634 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.289617 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:59:57.483570 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.483483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8" event={"ID":"8589cf1b-4575-44a0-92fb-65a3021168dd","Type":"ContainerStarted","Data":"79dbb38c042c2dad46bd635e6898d31590558ea0dced5f41732b7b96ab542deb"}
Apr 16 19:59:57.629755 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.629720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.632012 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.631988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert\") pod \"kserve-controller-manager-659c8cbdc-wt7jq\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") " pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.746020 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.745938 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-drflm\""
Apr 16 19:59:57.753302 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.753273 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 19:59:57.898483 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:57.898449 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-wt7jq"]
Apr 16 19:59:57.899516 ip-10-0-129-34 kubenswrapper[2570]: W0416 19:59:57.899484 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda362a023_f232_465c_834a_95b238b1e655.slice/crio-2e4a6d9bfaee968f0f6347f9a267fa0a2f393cf74f85bcf7524cfb34a89924ff WatchSource:0}: Error finding container 2e4a6d9bfaee968f0f6347f9a267fa0a2f393cf74f85bcf7524cfb34a89924ff: Status 404 returned error can't find the container with id 2e4a6d9bfaee968f0f6347f9a267fa0a2f393cf74f85bcf7524cfb34a89924ff
Apr 16 19:59:58.490581 ip-10-0-129-34 kubenswrapper[2570]: I0416 19:59:58.490540 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" event={"ID":"a362a023-f232-465c-834a-95b238b1e655","Type":"ContainerStarted","Data":"2e4a6d9bfaee968f0f6347f9a267fa0a2f393cf74f85bcf7524cfb34a89924ff"}
Apr 16 20:00:01.502522 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:01.502483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8" event={"ID":"8589cf1b-4575-44a0-92fb-65a3021168dd","Type":"ContainerStarted","Data":"4f27334ca4ccf378e760c941b7ea0447f9e562ab6826fd4191fbb1bd2893a9f7"}
Apr 16 20:00:01.502978 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:01.502678 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 20:00:01.503951 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:01.503929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" event={"ID":"a362a023-f232-465c-834a-95b238b1e655","Type":"ContainerStarted","Data":"ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707"}
Apr 16 20:00:01.504095 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:01.504079 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 20:00:01.520639 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:01.520581 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8" podStartSLOduration=2.148211069 podStartE2EDuration="5.520562102s" podCreationTimestamp="2026-04-16 19:59:56 +0000 UTC" firstStartedPulling="2026-04-16 19:59:57.289743337 +0000 UTC m=+360.824609350" lastFinishedPulling="2026-04-16 20:00:00.662094368 +0000 UTC m=+364.196960383" observedRunningTime="2026-04-16 20:00:01.518956514 +0000 UTC m=+365.053822563" watchObservedRunningTime="2026-04-16 20:00:01.520562102 +0000 UTC m=+365.055428138"
Apr 16 20:00:01.535207 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:01.535150 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" podStartSLOduration=2.7244986989999997 podStartE2EDuration="5.535129515s" podCreationTimestamp="2026-04-16 19:59:56 +0000 UTC" firstStartedPulling="2026-04-16 19:59:57.900947535 +0000 UTC m=+361.435813552" lastFinishedPulling="2026-04-16 20:00:00.711578353 +0000 UTC m=+364.246444368" observedRunningTime="2026-04-16 20:00:01.534498491 +0000 UTC m=+365.069364529" watchObservedRunningTime="2026-04-16 20:00:01.535129515 +0000 UTC m=+365.069995551"
Apr 16 20:00:32.510262 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:32.510225 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xt5m8"
Apr 16 20:00:32.513254 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:32.513232 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 20:00:33.800125 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:33.800089 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-wt7jq"]
Apr 16 20:00:33.800551 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:33.800327 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" podUID="a362a023-f232-465c-834a-95b238b1e655" containerName="manager" containerID="cri-o://ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707" gracePeriod=10
Apr 16 20:00:34.040376 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.040352 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq"
Apr 16 20:00:34.152782 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.152688 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crq4w\" (UniqueName: \"kubernetes.io/projected/a362a023-f232-465c-834a-95b238b1e655-kube-api-access-crq4w\") pod \"a362a023-f232-465c-834a-95b238b1e655\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") "
Apr 16 20:00:34.152782 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.152763 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert\") pod \"a362a023-f232-465c-834a-95b238b1e655\" (UID: \"a362a023-f232-465c-834a-95b238b1e655\") "
Apr 16 20:00:34.154980 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.154943 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a362a023-f232-465c-834a-95b238b1e655-kube-api-access-crq4w" (OuterVolumeSpecName: "kube-api-access-crq4w") pod "a362a023-f232-465c-834a-95b238b1e655" (UID: "a362a023-f232-465c-834a-95b238b1e655"). InnerVolumeSpecName "kube-api-access-crq4w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:00:34.154980 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.154972 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert" (OuterVolumeSpecName: "cert") pod "a362a023-f232-465c-834a-95b238b1e655" (UID: "a362a023-f232-465c-834a-95b238b1e655"). InnerVolumeSpecName "cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:00:34.254114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.254048 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crq4w\" (UniqueName: \"kubernetes.io/projected/a362a023-f232-465c-834a-95b238b1e655-kube-api-access-crq4w\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:00:34.254114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.254105 2570 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a362a023-f232-465c-834a-95b238b1e655-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:00:34.616822 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.616785 2570 generic.go:358] "Generic (PLEG): container finished" podID="a362a023-f232-465c-834a-95b238b1e655" containerID="ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707" exitCode=0 Apr 16 20:00:34.616994 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.616846 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" Apr 16 20:00:34.616994 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.616868 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" event={"ID":"a362a023-f232-465c-834a-95b238b1e655","Type":"ContainerDied","Data":"ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707"} Apr 16 20:00:34.616994 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.616901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-wt7jq" event={"ID":"a362a023-f232-465c-834a-95b238b1e655","Type":"ContainerDied","Data":"2e4a6d9bfaee968f0f6347f9a267fa0a2f393cf74f85bcf7524cfb34a89924ff"} Apr 16 20:00:34.616994 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.616916 2570 scope.go:117] "RemoveContainer" containerID="ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707" Apr 16 20:00:34.625532 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.625516 2570 scope.go:117] "RemoveContainer" containerID="ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707" Apr 16 20:00:34.625842 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:00:34.625821 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707\": container with ID starting with ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707 not found: ID does not exist" containerID="ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707" Apr 16 20:00:34.625909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.625855 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707"} err="failed to get container status \"ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707\": rpc 
error: code = NotFound desc = could not find container \"ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707\": container with ID starting with ee9c32225f4940863d5737804a3944c5b3f5e7ec2759063afcbf8d8597607707 not found: ID does not exist" Apr 16 20:00:34.645272 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.645238 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-wt7jq"] Apr 16 20:00:34.650238 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:34.650206 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-wt7jq"] Apr 16 20:00:35.111591 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:00:35.111558 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a362a023-f232-465c-834a-95b238b1e655" path="/var/lib/kubelet/pods/a362a023-f232-465c-834a-95b238b1e655/volumes" Apr 16 20:01:09.056513 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.056478 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-42lzj"] Apr 16 20:01:09.056987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.056814 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a362a023-f232-465c-834a-95b238b1e655" containerName="manager" Apr 16 20:01:09.056987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.056825 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a362a023-f232-465c-834a-95b238b1e655" containerName="manager" Apr 16 20:01:09.056987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.056894 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a362a023-f232-465c-834a-95b238b1e655" containerName="manager" Apr 16 20:01:09.061021 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.060999 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.063743 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.063722 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:01:09.063904 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.063728 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-z8gwp\"" Apr 16 20:01:09.066791 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.066760 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-d6hxk"] Apr 16 20:01:09.070129 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.070104 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.071973 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.071946 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-42lzj"] Apr 16 20:01:09.077898 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.077872 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:01:09.077898 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.077889 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-vtczt\"" Apr 16 20:01:09.084642 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.084610 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-d6hxk"] Apr 16 20:01:09.141696 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.141649 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt6bh\" (UniqueName: \"kubernetes.io/projected/cea8e105-f656-446f-9fe4-949adf51ce2a-kube-api-access-vt6bh\") pod 
\"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.141894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.141707 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6434a9-5e23-42f8-920a-c035adf4f4c2-tls-certs\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.141894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.141791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skd8g\" (UniqueName: \"kubernetes.io/projected/ac6434a9-5e23-42f8-920a-c035adf4f4c2-kube-api-access-skd8g\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.141894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.141830 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea8e105-f656-446f-9fe4-949adf51ce2a-cert\") pod \"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.242988 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.242928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt6bh\" (UniqueName: \"kubernetes.io/projected/cea8e105-f656-446f-9fe4-949adf51ce2a-kube-api-access-vt6bh\") pod \"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.243226 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.243100 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6434a9-5e23-42f8-920a-c035adf4f4c2-tls-certs\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.243226 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.243172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skd8g\" (UniqueName: \"kubernetes.io/projected/ac6434a9-5e23-42f8-920a-c035adf4f4c2-kube-api-access-skd8g\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.243226 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.243218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea8e105-f656-446f-9fe4-949adf51ce2a-cert\") pod \"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.243428 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:01:09.243243 2570 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 20:01:09.243428 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:01:09.243313 2570 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 20:01:09.243428 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:01:09.243334 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6434a9-5e23-42f8-920a-c035adf4f4c2-tls-certs podName:ac6434a9-5e23-42f8-920a-c035adf4f4c2 nodeName:}" failed. No retries permitted until 2026-04-16 20:01:09.743291632 +0000 UTC m=+433.278157653 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/ac6434a9-5e23-42f8-920a-c035adf4f4c2-tls-certs") pod "model-serving-api-86f7b4b499-42lzj" (UID: "ac6434a9-5e23-42f8-920a-c035adf4f4c2") : secret "model-serving-api-tls" not found Apr 16 20:01:09.243428 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:01:09.243361 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cea8e105-f656-446f-9fe4-949adf51ce2a-cert podName:cea8e105-f656-446f-9fe4-949adf51ce2a nodeName:}" failed. No retries permitted until 2026-04-16 20:01:09.743346494 +0000 UTC m=+433.278212508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cea8e105-f656-446f-9fe4-949adf51ce2a-cert") pod "odh-model-controller-696fc77849-d6hxk" (UID: "cea8e105-f656-446f-9fe4-949adf51ce2a") : secret "odh-model-controller-webhook-cert" not found Apr 16 20:01:09.255551 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.255519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skd8g\" (UniqueName: \"kubernetes.io/projected/ac6434a9-5e23-42f8-920a-c035adf4f4c2-kube-api-access-skd8g\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.255704 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.255633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt6bh\" (UniqueName: \"kubernetes.io/projected/cea8e105-f656-446f-9fe4-949adf51ce2a-kube-api-access-vt6bh\") pod \"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.748729 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.748692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cea8e105-f656-446f-9fe4-949adf51ce2a-cert\") pod \"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.748912 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.748757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6434a9-5e23-42f8-920a-c035adf4f4c2-tls-certs\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.751179 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.751155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cea8e105-f656-446f-9fe4-949adf51ce2a-cert\") pod \"odh-model-controller-696fc77849-d6hxk\" (UID: \"cea8e105-f656-446f-9fe4-949adf51ce2a\") " pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:09.751314 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.751204 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6434a9-5e23-42f8-920a-c035adf4f4c2-tls-certs\") pod \"model-serving-api-86f7b4b499-42lzj\" (UID: \"ac6434a9-5e23-42f8-920a-c035adf4f4c2\") " pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.972908 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.972858 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:09.981775 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:09.981738 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:10.113765 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:10.113737 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-42lzj"] Apr 16 20:01:10.115528 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:01:10.115500 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac6434a9_5e23_42f8_920a_c035adf4f4c2.slice/crio-7711c44f3e1b3dade54f188f567d617fab0913fb33f36ff06cb75b5f41734ac1 WatchSource:0}: Error finding container 7711c44f3e1b3dade54f188f567d617fab0913fb33f36ff06cb75b5f41734ac1: Status 404 returned error can't find the container with id 7711c44f3e1b3dade54f188f567d617fab0913fb33f36ff06cb75b5f41734ac1 Apr 16 20:01:10.128932 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:10.128910 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-d6hxk"] Apr 16 20:01:10.131300 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:01:10.131266 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea8e105_f656_446f_9fe4_949adf51ce2a.slice/crio-5bfc61da1139aa7b1f6f635f9101fcb248b90cfcd86b95e2a1b939f7b7ec955b WatchSource:0}: Error finding container 5bfc61da1139aa7b1f6f635f9101fcb248b90cfcd86b95e2a1b939f7b7ec955b: Status 404 returned error can't find the container with id 5bfc61da1139aa7b1f6f635f9101fcb248b90cfcd86b95e2a1b939f7b7ec955b Apr 16 20:01:10.734668 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:10.734630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-42lzj" event={"ID":"ac6434a9-5e23-42f8-920a-c035adf4f4c2","Type":"ContainerStarted","Data":"7711c44f3e1b3dade54f188f567d617fab0913fb33f36ff06cb75b5f41734ac1"} Apr 16 20:01:10.735968 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:10.735930 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-d6hxk" event={"ID":"cea8e105-f656-446f-9fe4-949adf51ce2a","Type":"ContainerStarted","Data":"5bfc61da1139aa7b1f6f635f9101fcb248b90cfcd86b95e2a1b939f7b7ec955b"} Apr 16 20:01:13.755018 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:13.754984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-d6hxk" event={"ID":"cea8e105-f656-446f-9fe4-949adf51ce2a","Type":"ContainerStarted","Data":"820b768a07cc2b5ef3c96e1d711100586752974e2b424ee2bb2ec6dc33caf1e1"} Apr 16 20:01:13.755491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:13.755095 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:13.756367 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:13.756347 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-42lzj" event={"ID":"ac6434a9-5e23-42f8-920a-c035adf4f4c2","Type":"ContainerStarted","Data":"2f34d47a5c64b6b10ae432914264954375f9acefdc42e22ab134d7e7e2f813c0"} Apr 16 20:01:13.756474 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:13.756462 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:13.775730 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:13.775672 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-d6hxk" podStartSLOduration=1.676520818 podStartE2EDuration="4.775658194s" podCreationTimestamp="2026-04-16 20:01:09 +0000 UTC" firstStartedPulling="2026-04-16 20:01:10.132574836 +0000 UTC m=+433.667440853" lastFinishedPulling="2026-04-16 20:01:13.231712204 +0000 UTC m=+436.766578229" observedRunningTime="2026-04-16 20:01:13.773901158 +0000 UTC m=+437.308767215" watchObservedRunningTime="2026-04-16 20:01:13.775658194 +0000 UTC m=+437.310524230" Apr 16 
20:01:13.792258 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:13.792203 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-42lzj" podStartSLOduration=1.6813607990000001 podStartE2EDuration="4.792188948s" podCreationTimestamp="2026-04-16 20:01:09 +0000 UTC" firstStartedPulling="2026-04-16 20:01:10.118286299 +0000 UTC m=+433.653152328" lastFinishedPulling="2026-04-16 20:01:13.229114463 +0000 UTC m=+436.763980477" observedRunningTime="2026-04-16 20:01:13.791278238 +0000 UTC m=+437.326144274" watchObservedRunningTime="2026-04-16 20:01:13.792188948 +0000 UTC m=+437.327054984" Apr 16 20:01:24.762410 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:24.762337 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-d6hxk" Apr 16 20:01:24.764223 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:24.764203 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-42lzj" Apr 16 20:01:28.248108 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.248071 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68f588d468-xszx9"] Apr 16 20:01:28.251826 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.251803 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.264026 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.263998 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f588d468-xszx9"] Apr 16 20:01:28.420173 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420126 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g9nd\" (UniqueName: \"kubernetes.io/projected/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-kube-api-access-5g9nd\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.420352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420194 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-oauth-serving-cert\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.420352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420267 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-oauth-config\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.420352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-config\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 
16 20:01:28.420491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420355 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-trusted-ca-bundle\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.420491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-serving-cert\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.420491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.420407 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-service-ca\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521098 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521041 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-oauth-config\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521110 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-config\") pod 
\"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-trusted-ca-bundle\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-serving-cert\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-service-ca\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521224 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9nd\" (UniqueName: \"kubernetes.io/projected/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-kube-api-access-5g9nd\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9" Apr 16 20:01:28.521279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.521251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-oauth-serving-cert\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.522036 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.522004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-service-ca\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.522172 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.522010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-oauth-serving-cert\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.522172 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.522149 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-config\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.522172 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.522158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-trusted-ca-bundle\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.523644 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.523613 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-serving-cert\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.523763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.523743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-console-oauth-config\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.530439 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.530415 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g9nd\" (UniqueName: \"kubernetes.io/projected/ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8-kube-api-access-5g9nd\") pod \"console-68f588d468-xszx9\" (UID: \"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8\") " pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.561945 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.561903 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:28.692975 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.692946 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f588d468-xszx9"]
Apr 16 20:01:28.695685 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:01:28.695647 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef57cf43_51ab_4af0_aa56_6cc5a51fe4c8.slice/crio-b0fe7dd4e0e1daed2ab64a1e549f3a65117a8d7f00d4d3c271910e62427d24ef WatchSource:0}: Error finding container b0fe7dd4e0e1daed2ab64a1e549f3a65117a8d7f00d4d3c271910e62427d24ef: Status 404 returned error can't find the container with id b0fe7dd4e0e1daed2ab64a1e549f3a65117a8d7f00d4d3c271910e62427d24ef
Apr 16 20:01:28.807670 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.807574 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f588d468-xszx9" event={"ID":"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8","Type":"ContainerStarted","Data":"a5cfce4fb93417fca1ac9e49511eaf67966e425150871b35f2cc0b3d4fa1a366"}
Apr 16 20:01:28.807670 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.807614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f588d468-xszx9" event={"ID":"ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8","Type":"ContainerStarted","Data":"b0fe7dd4e0e1daed2ab64a1e549f3a65117a8d7f00d4d3c271910e62427d24ef"}
Apr 16 20:01:28.826948 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:28.826877 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68f588d468-xszx9" podStartSLOduration=0.826856796 podStartE2EDuration="826.856796ms" podCreationTimestamp="2026-04-16 20:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:01:28.82539342 +0000 UTC m=+452.360259447" watchObservedRunningTime="2026-04-16 20:01:28.826856796 +0000 UTC m=+452.361722833"
Apr 16 20:01:37.013028 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.012987 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"]
Apr 16 20:01:37.016516 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.016497 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.019494 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.019470 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 16 20:01:37.019607 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.019588 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-n4npf\""
Apr 16 20:01:37.025414 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.025386 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"]
Apr 16 20:01:37.201607 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.201566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-ctk4d\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.201782 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.201693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c54rq\" (UniqueName: \"kubernetes.io/projected/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-kube-api-access-c54rq\") pod \"seaweedfs-tls-custom-ddd4dbfd-ctk4d\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.302571 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.302474 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c54rq\" (UniqueName: \"kubernetes.io/projected/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-kube-api-access-c54rq\") pod \"seaweedfs-tls-custom-ddd4dbfd-ctk4d\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.302571 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.302548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-ctk4d\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.302929 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.302908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-ctk4d\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.311470 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.311435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c54rq\" (UniqueName: \"kubernetes.io/projected/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-kube-api-access-c54rq\") pod \"seaweedfs-tls-custom-ddd4dbfd-ctk4d\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.326872 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.326838 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:37.454253 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.454222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"]
Apr 16 20:01:37.458038 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:01:37.457993 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f84ea3_65c2_43b8_9718_7a2c0063e1ce.slice/crio-b1a85db601361158ca4976deb9a336dbd6ca70f561dfcf903b11dcece7b43a4d WatchSource:0}: Error finding container b1a85db601361158ca4976deb9a336dbd6ca70f561dfcf903b11dcece7b43a4d: Status 404 returned error can't find the container with id b1a85db601361158ca4976deb9a336dbd6ca70f561dfcf903b11dcece7b43a4d
Apr 16 20:01:37.840404 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:37.840360 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d" event={"ID":"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce","Type":"ContainerStarted","Data":"b1a85db601361158ca4976deb9a336dbd6ca70f561dfcf903b11dcece7b43a4d"}
Apr 16 20:01:38.562566 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:38.562526 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:38.562997 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:38.562587 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:38.568204 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:38.568178 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:38.849878 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:38.849794 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68f588d468-xszx9"
Apr 16 20:01:38.920171 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:38.920137 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7ddcc767f5-74npf"]
Apr 16 20:01:40.852624 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:40.852583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d" event={"ID":"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce","Type":"ContainerStarted","Data":"58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4"}
Apr 16 20:01:40.870409 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:40.870356 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d" podStartSLOduration=2.334264087 podStartE2EDuration="4.870339414s" podCreationTimestamp="2026-04-16 20:01:36 +0000 UTC" firstStartedPulling="2026-04-16 20:01:37.459808171 +0000 UTC m=+460.994674186" lastFinishedPulling="2026-04-16 20:01:39.995883496 +0000 UTC m=+463.530749513" observedRunningTime="2026-04-16 20:01:40.868566243 +0000 UTC m=+464.403432312" watchObservedRunningTime="2026-04-16 20:01:40.870339414 +0000 UTC m=+464.405205450"
Apr 16 20:01:42.201261 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:42.201224 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"]
Apr 16 20:01:42.859968 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:42.859928 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d" podUID="b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" containerName="seaweedfs-tls-custom" containerID="cri-o://58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4" gracePeriod=30
Apr 16 20:01:44.102377 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.102351 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:44.256884 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.256851 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c54rq\" (UniqueName: \"kubernetes.io/projected/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-kube-api-access-c54rq\") pod \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") "
Apr 16 20:01:44.257041 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.256919 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-data\") pod \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\" (UID: \"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce\") "
Apr 16 20:01:44.258237 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.258205 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-data" (OuterVolumeSpecName: "data") pod "b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" (UID: "b5f84ea3-65c2-43b8-9718-7a2c0063e1ce"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:01:44.258872 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.258848 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-kube-api-access-c54rq" (OuterVolumeSpecName: "kube-api-access-c54rq") pod "b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" (UID: "b5f84ea3-65c2-43b8-9718-7a2c0063e1ce"). InnerVolumeSpecName "kube-api-access-c54rq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:01:44.357594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.357553 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c54rq\" (UniqueName: \"kubernetes.io/projected/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-kube-api-access-c54rq\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:01:44.357594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.357585 2570 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce-data\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:01:44.867490 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.867452 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" containerID="58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4" exitCode=0
Apr 16 20:01:44.867659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.867518 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"
Apr 16 20:01:44.867659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.867536 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d" event={"ID":"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce","Type":"ContainerDied","Data":"58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4"}
Apr 16 20:01:44.867659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.867574 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d" event={"ID":"b5f84ea3-65c2-43b8-9718-7a2c0063e1ce","Type":"ContainerDied","Data":"b1a85db601361158ca4976deb9a336dbd6ca70f561dfcf903b11dcece7b43a4d"}
Apr 16 20:01:44.867659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.867592 2570 scope.go:117] "RemoveContainer" containerID="58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4"
Apr 16 20:01:44.876991 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.876964 2570 scope.go:117] "RemoveContainer" containerID="58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4"
Apr 16 20:01:44.877303 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:01:44.877282 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4\": container with ID starting with 58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4 not found: ID does not exist" containerID="58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4"
Apr 16 20:01:44.877372 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.877314 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4"} err="failed to get container status \"58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4\": rpc error: code = NotFound desc = could not find container \"58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4\": container with ID starting with 58776c3efea521d05b218e0d9ccbcfd6f9d9dc41e6c0895a87f6b4b3e8cdacc4 not found: ID does not exist"
Apr 16 20:01:44.889889 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.889862 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"]
Apr 16 20:01:44.896770 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.896742 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ctk4d"]
Apr 16 20:01:44.932089 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.932042 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"]
Apr 16 20:01:44.932461 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.932448 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" containerName="seaweedfs-tls-custom"
Apr 16 20:01:44.932506 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.932464 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" containerName="seaweedfs-tls-custom"
Apr 16 20:01:44.932541 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.932529 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" containerName="seaweedfs-tls-custom"
Apr 16 20:01:44.937078 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.936686 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:44.940222 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.940199 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 16 20:01:44.941310 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.940465 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-n4npf\""
Apr 16 20:01:44.941465 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.940483 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 16 20:01:44.942818 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.942795 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"]
Apr 16 20:01:44.962913 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.962872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/76018fe1-a7f6-440e-b2bd-f43e53652d09-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:44.963096 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.962930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77jh\" (UniqueName: \"kubernetes.io/projected/76018fe1-a7f6-440e-b2bd-f43e53652d09-kube-api-access-q77jh\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:44.963096 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:44.962966 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/76018fe1-a7f6-440e-b2bd-f43e53652d09-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.064164 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.064117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/76018fe1-a7f6-440e-b2bd-f43e53652d09-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.064365 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.064198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q77jh\" (UniqueName: \"kubernetes.io/projected/76018fe1-a7f6-440e-b2bd-f43e53652d09-kube-api-access-q77jh\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.064365 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.064227 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/76018fe1-a7f6-440e-b2bd-f43e53652d09-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.064694 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.064671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/76018fe1-a7f6-440e-b2bd-f43e53652d09-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.066604 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.066585 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/76018fe1-a7f6-440e-b2bd-f43e53652d09-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.073805 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.073776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77jh\" (UniqueName: \"kubernetes.io/projected/76018fe1-a7f6-440e-b2bd-f43e53652d09-kube-api-access-q77jh\") pod \"seaweedfs-tls-custom-5c88b85bb7-bxf6z\" (UID: \"76018fe1-a7f6-440e-b2bd-f43e53652d09\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.112246 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.112212 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f84ea3-65c2-43b8-9718-7a2c0063e1ce" path="/var/lib/kubelet/pods/b5f84ea3-65c2-43b8-9718-7a2c0063e1ce/volumes"
Apr 16 20:01:45.248923 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.248837 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"
Apr 16 20:01:45.376485 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.376458 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z"]
Apr 16 20:01:45.379162 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:01:45.379117 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76018fe1_a7f6_440e_b2bd_f43e53652d09.slice/crio-778af8ba388b4533a1f147b1d17dec8fddd39cdf7162ef6170c7082e16fbc76f WatchSource:0}: Error finding container 778af8ba388b4533a1f147b1d17dec8fddd39cdf7162ef6170c7082e16fbc76f: Status 404 returned error can't find the container with id 778af8ba388b4533a1f147b1d17dec8fddd39cdf7162ef6170c7082e16fbc76f
Apr 16 20:01:45.873111 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.873075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z" event={"ID":"76018fe1-a7f6-440e-b2bd-f43e53652d09","Type":"ContainerStarted","Data":"a86653ad63afd8049cc99ad6160f7cec5036723d13870ed5b16988099e9e5fe1"}
Apr 16 20:01:45.873111 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.873117 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z" event={"ID":"76018fe1-a7f6-440e-b2bd-f43e53652d09","Type":"ContainerStarted","Data":"778af8ba388b4533a1f147b1d17dec8fddd39cdf7162ef6170c7082e16fbc76f"}
Apr 16 20:01:45.891778 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:01:45.891683 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-bxf6z" podStartSLOduration=1.6443733759999999 podStartE2EDuration="1.891666489s" podCreationTimestamp="2026-04-16 20:01:44 +0000 UTC" firstStartedPulling="2026-04-16 20:01:45.380332769 +0000 UTC m=+468.915198783" lastFinishedPulling="2026-04-16 20:01:45.627625882 +0000 UTC m=+469.162491896" observedRunningTime="2026-04-16 20:01:45.88964733 +0000 UTC m=+469.424513366" watchObservedRunningTime="2026-04-16 20:01:45.891666489 +0000 UTC m=+469.426532524"
Apr 16 20:02:03.942904 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:03.942845 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7ddcc767f5-74npf" podUID="7a9895a8-edbc-434f-9da4-900c567a1fa1" containerName="console" containerID="cri-o://c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb" gracePeriod=15
Apr 16 20:02:04.185185 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.185161 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7ddcc767f5-74npf_7a9895a8-edbc-434f-9da4-900c567a1fa1/console/0.log"
Apr 16 20:02:04.185321 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.185225 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 20:02:04.206877 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.206789 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-trusted-ca-bundle\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.206877 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.206861 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-service-ca\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.207141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.206907 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-oauth-config\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.207141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.206928 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-serving-cert\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.207141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.206963 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556z2\" (UniqueName: \"kubernetes.io/projected/7a9895a8-edbc-434f-9da4-900c567a1fa1-kube-api-access-556z2\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.207141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207006 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-config\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.207141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207032 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-oauth-serving-cert\") pod \"7a9895a8-edbc-434f-9da4-900c567a1fa1\" (UID: \"7a9895a8-edbc-434f-9da4-900c567a1fa1\") "
Apr 16 20:02:04.207455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207171 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:02:04.207455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207288 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-service-ca" (OuterVolumeSpecName: "service-ca") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:02:04.207455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207339 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-trusted-ca-bundle\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.207920 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207616 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:02:04.207920 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.207641 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-config" (OuterVolumeSpecName: "console-config") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:02:04.209368 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.209333 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9895a8-edbc-434f-9da4-900c567a1fa1-kube-api-access-556z2" (OuterVolumeSpecName: "kube-api-access-556z2") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "kube-api-access-556z2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:02:04.209778 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.209754 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:02:04.209937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.209917 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7a9895a8-edbc-434f-9da4-900c567a1fa1" (UID: "7a9895a8-edbc-434f-9da4-900c567a1fa1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:02:04.308477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.308439 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.308477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.308471 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-oauth-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.308477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.308482 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a9895a8-edbc-434f-9da4-900c567a1fa1-service-ca\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.308715 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.308491 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-oauth-config\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.308715 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.308500 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9895a8-edbc-434f-9da4-900c567a1fa1-console-serving-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.308715 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.308509 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-556z2\" (UniqueName: \"kubernetes.io/projected/7a9895a8-edbc-434f-9da4-900c567a1fa1-kube-api-access-556z2\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:02:04.943229 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.943201 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7ddcc767f5-74npf_7a9895a8-edbc-434f-9da4-900c567a1fa1/console/0.log"
Apr 16 20:02:04.943634 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.943240 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a9895a8-edbc-434f-9da4-900c567a1fa1" containerID="c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb" exitCode=2
Apr 16 20:02:04.943634 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.943272 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ddcc767f5-74npf" event={"ID":"7a9895a8-edbc-434f-9da4-900c567a1fa1","Type":"ContainerDied","Data":"c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb"}
Apr 16 20:02:04.943634 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.943318 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ddcc767f5-74npf" event={"ID":"7a9895a8-edbc-434f-9da4-900c567a1fa1","Type":"ContainerDied","Data":"01ef11ab0b6ef29f57c437252a4e77ee08deda455c5f858b3c010839fa6904b5"}
Apr 16 20:02:04.943634 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.943322 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ddcc767f5-74npf"
Apr 16 20:02:04.943634 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.943335 2570 scope.go:117] "RemoveContainer" containerID="c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb"
Apr 16 20:02:04.951737 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.951711 2570 scope.go:117] "RemoveContainer" containerID="c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb"
Apr 16 20:02:04.952010 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:02:04.951989 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb\": container with ID starting with c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb not found: ID does not exist" containerID="c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb"
Apr 16 20:02:04.952149 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.952017 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb"} err="failed to get container status \"c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb\": rpc error: code = NotFound desc = could not find container \"c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb\": container with ID starting with c34989cc3fd97f87c1b9be4629a6f579768709e14f7d671ffd044aa5c28dd0cb not found: ID does not exist"
Apr 16 20:02:04.968887 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.968853 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7ddcc767f5-74npf"]
Apr 16 20:02:04.978761 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:04.978734 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7ddcc767f5-74npf"]
Apr 16 20:02:05.112847 ip-10-0-129-34 kubenswrapper[2570]: I0416
20:02:05.112810 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9895a8-edbc-434f-9da4-900c567a1fa1" path="/var/lib/kubelet/pods/7a9895a8-edbc-434f-9da4-900c567a1fa1/volumes" Apr 16 20:02:11.867301 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.867266 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b"] Apr 16 20:02:11.867910 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.867634 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a9895a8-edbc-434f-9da4-900c567a1fa1" containerName="console" Apr 16 20:02:11.867910 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.867648 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9895a8-edbc-434f-9da4-900c567a1fa1" containerName="console" Apr 16 20:02:11.867910 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.867722 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a9895a8-edbc-434f-9da4-900c567a1fa1" containerName="console" Apr 16 20:02:11.872424 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.872406 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:11.874946 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.874917 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t49kf\"" Apr 16 20:02:11.878727 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.878705 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b"] Apr 16 20:02:11.976719 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:11.976686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221ac944-6650-459b-8c00-c3354de5cf19-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b\" (UID: \"221ac944-6650-459b-8c00-c3354de5cf19\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:12.077425 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:12.077388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221ac944-6650-459b-8c00-c3354de5cf19-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b\" (UID: \"221ac944-6650-459b-8c00-c3354de5cf19\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:12.077772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:12.077753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221ac944-6650-459b-8c00-c3354de5cf19-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b\" (UID: \"221ac944-6650-459b-8c00-c3354de5cf19\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:12.182976 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:02:12.182889 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:12.311447 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:12.311414 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b"] Apr 16 20:02:12.313359 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:02:12.313330 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221ac944_6650_459b_8c00_c3354de5cf19.slice/crio-93186fd0193e467caf6a7789f55459a7b8a4601b668163c4c4be9e5e8f9a966f WatchSource:0}: Error finding container 93186fd0193e467caf6a7789f55459a7b8a4601b668163c4c4be9e5e8f9a966f: Status 404 returned error can't find the container with id 93186fd0193e467caf6a7789f55459a7b8a4601b668163c4c4be9e5e8f9a966f Apr 16 20:02:12.978610 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:12.978568 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerStarted","Data":"93186fd0193e467caf6a7789f55459a7b8a4601b668163c4c4be9e5e8f9a966f"} Apr 16 20:02:15.990489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:15.990386 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerStarted","Data":"4a3c29aa6f1d159835fe056ef82ca3394eed672442b5a0cfa43dce79256b4e19"} Apr 16 20:02:20.005705 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:20.005665 2570 generic.go:358] "Generic (PLEG): container finished" podID="221ac944-6650-459b-8c00-c3354de5cf19" containerID="4a3c29aa6f1d159835fe056ef82ca3394eed672442b5a0cfa43dce79256b4e19" exitCode=0 Apr 16 20:02:20.006100 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:02:20.005741 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerDied","Data":"4a3c29aa6f1d159835fe056ef82ca3394eed672442b5a0cfa43dce79256b4e19"} Apr 16 20:02:33.060623 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:33.060583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerStarted","Data":"2fab56ab871d7244b26607c9c6ea043df1066cb13e111f2327bd4abba43a5844"} Apr 16 20:02:36.074482 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:36.074444 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerStarted","Data":"cd17fea3eedae411694d99391c7e8ad97a909bcd961be5baf4405144446363b0"} Apr 16 20:02:36.074898 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:36.074726 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:36.076240 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:36.076190 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:02:36.092760 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:36.092705 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podStartSLOduration=1.487630284 podStartE2EDuration="25.092688673s" podCreationTimestamp="2026-04-16 20:02:11 +0000 UTC" 
firstStartedPulling="2026-04-16 20:02:12.31533414 +0000 UTC m=+495.850200153" lastFinishedPulling="2026-04-16 20:02:35.920392528 +0000 UTC m=+519.455258542" observedRunningTime="2026-04-16 20:02:36.090322057 +0000 UTC m=+519.625188092" watchObservedRunningTime="2026-04-16 20:02:36.092688673 +0000 UTC m=+519.627554708" Apr 16 20:02:37.077926 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:37.077877 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:02:37.078392 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:37.078042 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:02:37.079020 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:37.078988 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:02:38.081835 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:38.081791 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:02:38.082316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:38.082204 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 16 20:02:48.082100 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:48.081975 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:02:48.082634 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:48.082499 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:02:58.082201 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:58.082154 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:02:58.082642 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:02:58.082596 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:08.082007 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:08.081945 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:03:08.082484 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:08.082364 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:18.081992 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:18.081942 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:03:18.082444 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:18.082349 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:28.082253 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:28.082204 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:03:28.082738 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:28.082644 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:38.082247 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:38.082198 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.35:8080: connect: connection refused" Apr 16 20:03:38.082819 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:38.082785 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:48.082330 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:48.082291 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:03:48.082811 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:48.082383 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:03:57.018955 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.018921 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b"] Apr 16 20:03:57.019740 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.019709 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" containerID="cri-o://2fab56ab871d7244b26607c9c6ea043df1066cb13e111f2327bd4abba43a5844" gracePeriod=30 Apr 16 20:03:57.019963 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.019739 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" containerID="cri-o://cd17fea3eedae411694d99391c7e8ad97a909bcd961be5baf4405144446363b0" gracePeriod=30 Apr 16 20:03:57.123760 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.123727 2570 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8"] Apr 16 20:03:57.127643 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.127620 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:03:57.139426 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.139397 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8"] Apr 16 20:03:57.224407 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.224366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8\" (UID: \"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:03:57.325451 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.325348 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8\" (UID: \"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:03:57.325780 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.325758 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8\" (UID: \"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:03:57.441162 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.441125 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:03:57.570358 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:57.570284 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8"] Apr 16 20:03:57.572679 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:03:57.572644 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f849bc_8b50_4f05_ad7a_d7d01893d8d3.slice/crio-c43c3ad643146aedace5ae2b407b344d73caaf85a0095d7880fc0775c663ffeb WatchSource:0}: Error finding container c43c3ad643146aedace5ae2b407b344d73caaf85a0095d7880fc0775c663ffeb: Status 404 returned error can't find the container with id c43c3ad643146aedace5ae2b407b344d73caaf85a0095d7880fc0775c663ffeb Apr 16 20:03:58.082271 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:58.082218 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:03:58.082732 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:58.082699 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:58.360932 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:58.360844 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerStarted","Data":"259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4"} Apr 16 20:03:58.360932 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:03:58.360883 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerStarted","Data":"c43c3ad643146aedace5ae2b407b344d73caaf85a0095d7880fc0775c663ffeb"} Apr 16 20:04:02.375856 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:02.375823 2570 generic.go:358] "Generic (PLEG): container finished" podID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerID="259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4" exitCode=0 Apr 16 20:04:02.376356 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:02.375900 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerDied","Data":"259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4"} Apr 16 20:04:02.377938 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:02.377913 2570 generic.go:358] "Generic (PLEG): container finished" podID="221ac944-6650-459b-8c00-c3354de5cf19" containerID="2fab56ab871d7244b26607c9c6ea043df1066cb13e111f2327bd4abba43a5844" exitCode=0 Apr 16 20:04:02.378040 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:02.377975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerDied","Data":"2fab56ab871d7244b26607c9c6ea043df1066cb13e111f2327bd4abba43a5844"} Apr 16 20:04:03.384105 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.384046 2570 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerStarted","Data":"d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659"} Apr 16 20:04:03.384105 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.384110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerStarted","Data":"c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11"} Apr 16 20:04:03.384586 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.384503 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:04:03.384586 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.384531 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:04:03.385913 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.385885 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:03.386554 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.386526 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:03.401289 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:03.401239 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podStartSLOduration=6.401226817 podStartE2EDuration="6.401226817s" podCreationTimestamp="2026-04-16 20:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:04:03.399768353 +0000 UTC m=+606.934634400" watchObservedRunningTime="2026-04-16 20:04:03.401226817 +0000 UTC m=+606.936092914" Apr 16 20:04:04.387907 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:04.387855 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:04.388360 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:04.388250 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:08.082155 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:08.082100 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:04:08.082583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:08.082409 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:14.388230 ip-10-0-129-34 kubenswrapper[2570]: I0416 
20:04:14.388179 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:14.388660 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:14.388630 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:18.081868 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:18.081763 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:04:18.082312 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:18.081939 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:04:18.082312 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:18.082102 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:18.082312 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:18.082221 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:04:24.388820 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:24.388772 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:24.389266 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:24.389215 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:27.478503 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:27.478418 2570 generic.go:358] "Generic (PLEG): container finished" podID="221ac944-6650-459b-8c00-c3354de5cf19" containerID="cd17fea3eedae411694d99391c7e8ad97a909bcd961be5baf4405144446363b0" exitCode=0 Apr 16 20:04:27.478503 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:27.478468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerDied","Data":"cd17fea3eedae411694d99391c7e8ad97a909bcd961be5baf4405144446363b0"} Apr 16 20:04:27.672949 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:27.672922 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:04:27.694553 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:27.694518 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221ac944-6650-459b-8c00-c3354de5cf19-kserve-provision-location\") pod \"221ac944-6650-459b-8c00-c3354de5cf19\" (UID: \"221ac944-6650-459b-8c00-c3354de5cf19\") " Apr 16 20:04:27.694873 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:27.694844 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221ac944-6650-459b-8c00-c3354de5cf19-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "221ac944-6650-459b-8c00-c3354de5cf19" (UID: "221ac944-6650-459b-8c00-c3354de5cf19"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:04:27.795284 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:27.795245 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221ac944-6650-459b-8c00-c3354de5cf19-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:04:28.483753 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.483717 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" Apr 16 20:04:28.484213 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.483721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b" event={"ID":"221ac944-6650-459b-8c00-c3354de5cf19","Type":"ContainerDied","Data":"93186fd0193e467caf6a7789f55459a7b8a4601b668163c4c4be9e5e8f9a966f"} Apr 16 20:04:28.484213 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.483848 2570 scope.go:117] "RemoveContainer" containerID="cd17fea3eedae411694d99391c7e8ad97a909bcd961be5baf4405144446363b0" Apr 16 20:04:28.492754 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.492733 2570 scope.go:117] "RemoveContainer" containerID="2fab56ab871d7244b26607c9c6ea043df1066cb13e111f2327bd4abba43a5844" Apr 16 20:04:28.500114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.500092 2570 scope.go:117] "RemoveContainer" containerID="4a3c29aa6f1d159835fe056ef82ca3394eed672442b5a0cfa43dce79256b4e19" Apr 16 20:04:28.507891 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.507871 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b"] Apr 16 20:04:28.514626 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:28.514601 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-tvb9b"] Apr 16 20:04:29.111982 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:29.111939 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221ac944-6650-459b-8c00-c3354de5cf19" path="/var/lib/kubelet/pods/221ac944-6650-459b-8c00-c3354de5cf19/volumes" Apr 16 20:04:34.388753 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:34.388694 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:34.389273 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:34.389248 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:44.388170 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:44.388116 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:44.388586 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:44.388498 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:54.388276 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:54.388223 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:04:54.388731 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:04:54.388706 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:04.387969 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:05:04.387907 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:05:04.388542 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:04.388420 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:12.107834 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:12.107798 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:05:12.108322 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:12.108134 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:05:22.222938 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:22.222894 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8"] Apr 16 20:05:22.223383 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:22.223354 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" containerID="cri-o://c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11" gracePeriod=30 Apr 16 20:05:22.223488 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:22.223433 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" containerID="cri-o://d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659" gracePeriod=30 Apr 16 20:05:26.681273 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:26.681239 2570 generic.go:358] "Generic (PLEG): container finished" podID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerID="c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11" exitCode=0 Apr 16 20:05:26.681656 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:26.681298 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerDied","Data":"c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11"} Apr 16 20:05:32.107536 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.107481 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:05:32.107967 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.107829 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:32.299297 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299264 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"] Apr 16 20:05:32.299678 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299664 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" Apr 16 20:05:32.299724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299681 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" Apr 16 20:05:32.299724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299702 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="storage-initializer" Apr 16 20:05:32.299724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299707 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="storage-initializer" Apr 16 20:05:32.299724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299714 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" Apr 16 20:05:32.299724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299721 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" Apr 16 20:05:32.299883 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299771 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="agent" Apr 16 20:05:32.299883 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.299783 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="221ac944-6650-459b-8c00-c3354de5cf19" containerName="kserve-container" Apr 16 20:05:32.302786 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.302769 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:32.310686 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.310656 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"] Apr 16 20:05:32.358493 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.358394 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c26d767-e75f-43c1-afb3-c7a4f7c9018b-kserve-provision-location\") pod \"isvc-logger-predictor-69fc5c8d55-hff57\" (UID: \"9c26d767-e75f-43c1-afb3-c7a4f7c9018b\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:32.459401 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.459352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c26d767-e75f-43c1-afb3-c7a4f7c9018b-kserve-provision-location\") pod \"isvc-logger-predictor-69fc5c8d55-hff57\" (UID: \"9c26d767-e75f-43c1-afb3-c7a4f7c9018b\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:32.459734 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.459715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c26d767-e75f-43c1-afb3-c7a4f7c9018b-kserve-provision-location\") pod \"isvc-logger-predictor-69fc5c8d55-hff57\" (UID: \"9c26d767-e75f-43c1-afb3-c7a4f7c9018b\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:32.615668 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.615581 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:32.744349 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.744323 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"] Apr 16 20:05:32.747341 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:05:32.747311 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c26d767_e75f_43c1_afb3_c7a4f7c9018b.slice/crio-995cb317899c0b4cf20f13e70d859f063e9a452ec0dea5d8fcea7409e1edb9f4 WatchSource:0}: Error finding container 995cb317899c0b4cf20f13e70d859f063e9a452ec0dea5d8fcea7409e1edb9f4: Status 404 returned error can't find the container with id 995cb317899c0b4cf20f13e70d859f063e9a452ec0dea5d8fcea7409e1edb9f4 Apr 16 20:05:32.749602 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:32.749583 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:05:33.712184 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:33.712147 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerStarted","Data":"bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8"} Apr 16 20:05:33.712184 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:33.712193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerStarted","Data":"995cb317899c0b4cf20f13e70d859f063e9a452ec0dea5d8fcea7409e1edb9f4"} Apr 16 20:05:36.723854 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:36.723812 2570 generic.go:358] "Generic (PLEG): container finished" podID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerID="bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8" 
exitCode=0 Apr 16 20:05:36.724290 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:36.723882 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerDied","Data":"bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8"} Apr 16 20:05:37.729505 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.729468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerStarted","Data":"8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e"} Apr 16 20:05:37.729921 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.729513 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerStarted","Data":"9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1"} Apr 16 20:05:37.729921 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.729785 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:37.729921 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.729816 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" Apr 16 20:05:37.731261 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.731227 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 20:05:37.732080 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.732038 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:37.748165 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:37.748116 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podStartSLOduration=5.748099859 podStartE2EDuration="5.748099859s" podCreationTimestamp="2026-04-16 20:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:05:37.746821198 +0000 UTC m=+701.281687233" watchObservedRunningTime="2026-04-16 20:05:37.748099859 +0000 UTC m=+701.282965894" Apr 16 20:05:38.733314 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:38.733275 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 20:05:38.733774 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:38.733677 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:42.107419 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:42.107373 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:05:42.107889 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:42.107728 
2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:48.733343 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:48.733241 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 20:05:48.733763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:48.733640 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:52.107671 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.107613 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:5000: connect: connection refused" Apr 16 20:05:52.108222 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.107755 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:05:52.108222 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.107976 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:05:52.108222 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:05:52.108159 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:05:52.378180 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.378155 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:05:52.439171 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.439133 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3-kserve-provision-location\") pod \"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3\" (UID: \"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3\") " Apr 16 20:05:52.439513 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.439489 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" (UID: "a6f849bc-8b50-4f05-ad7a-d7d01893d8d3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:05:52.539938 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.539895 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:05:52.781791 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.781758 2570 generic.go:358] "Generic (PLEG): container finished" podID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerID="d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659" exitCode=0 Apr 16 20:05:52.781956 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.781833 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerDied","Data":"d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659"} Apr 16 20:05:52.781956 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.781850 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" Apr 16 20:05:52.781956 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.781862 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8" event={"ID":"a6f849bc-8b50-4f05-ad7a-d7d01893d8d3","Type":"ContainerDied","Data":"c43c3ad643146aedace5ae2b407b344d73caaf85a0095d7880fc0775c663ffeb"} Apr 16 20:05:52.781956 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.781879 2570 scope.go:117] "RemoveContainer" containerID="d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659" Apr 16 20:05:52.789842 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.789821 2570 scope.go:117] "RemoveContainer" containerID="c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11" Apr 16 20:05:52.796934 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.796917 2570 scope.go:117] "RemoveContainer" containerID="259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4" Apr 16 20:05:52.803309 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.803286 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8"] Apr 16 20:05:52.804542 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.804517 2570 scope.go:117] "RemoveContainer" containerID="d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659" Apr 16 20:05:52.804929 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:05:52.804902 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659\": container with ID starting with d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659 not found: ID does not exist" containerID="d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659" Apr 16 20:05:52.805105 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:05:52.804943 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659"} err="failed to get container status \"d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659\": rpc error: code = NotFound desc = could not find container \"d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659\": container with ID starting with d2285903c3d150dc2158c2ccd352ad81474c2c7bad4b9037d6dd960323e3c659 not found: ID does not exist" Apr 16 20:05:52.805105 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.804968 2570 scope.go:117] "RemoveContainer" containerID="c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11" Apr 16 20:05:52.805476 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:05:52.805437 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11\": container with ID starting with c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11 not found: ID does not exist" containerID="c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11" Apr 16 20:05:52.805585 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.805479 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11"} err="failed to get container status \"c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11\": rpc error: code = NotFound desc = could not find container \"c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11\": container with ID starting with c0b9961e9a894f98a76fdda7abcd708c75ff8b488dc86caec15882e74e0cee11 not found: ID does not exist" Apr 16 20:05:52.805585 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.805502 2570 scope.go:117] "RemoveContainer" 
containerID="259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4" Apr 16 20:05:52.805779 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:05:52.805760 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4\": container with ID starting with 259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4 not found: ID does not exist" containerID="259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4" Apr 16 20:05:52.805840 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.805785 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4"} err="failed to get container status \"259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4\": rpc error: code = NotFound desc = could not find container \"259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4\": container with ID starting with 259a7a09ddc241943a4ef3bc78684e6ca7cb13cb1c438bad862697dc1b1418e4 not found: ID does not exist" Apr 16 20:05:52.808066 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:52.808032 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-bm9l8"] Apr 16 20:05:53.111603 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:53.111523 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" path="/var/lib/kubelet/pods/a6f849bc-8b50-4f05-ad7a-d7d01893d8d3/volumes" Apr 16 20:05:58.734119 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:58.734068 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: 
connect: connection refused" Apr 16 20:05:58.734544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:05:58.734519 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:06:08.733452 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:08.733402 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 20:06:08.734105 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:08.734033 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:06:18.734007 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:18.733959 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 20:06:18.734497 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:18.734381 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:06:28.734137 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:28.734086 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" 
podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:06:28.734599 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:28.734550 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:38.733553 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:38.733492 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:06:38.733980 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:38.733923 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:48.734034 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:48.733997 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"
Apr 16 20:06:48.734599 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:48.734296 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"
Apr 16 20:06:57.540309 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.540269 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"]
Apr 16 20:06:57.540681 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.540564 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" containerID="cri-o://9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1" gracePeriod=30
Apr 16 20:06:57.540734 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.540652 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" containerID="cri-o://8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e" gracePeriod=30
Apr 16 20:06:57.564827 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.564792 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"]
Apr 16 20:06:57.565292 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565275 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container"
Apr 16 20:06:57.565292 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565294 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container"
Apr 16 20:06:57.565405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565310 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent"
Apr 16 20:06:57.565405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565315 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent"
Apr 16 20:06:57.565405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565334 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="storage-initializer"
Apr 16 20:06:57.565405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565340 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="storage-initializer"
Apr 16 20:06:57.565405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565391 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="kserve-container"
Apr 16 20:06:57.565405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.565403 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f849bc-8b50-4f05-ad7a-d7d01893d8d3" containerName="agent"
Apr 16 20:06:57.568552 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.568535 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:06:57.577077 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.577028 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"]
Apr 16 20:06:57.711872 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.711833 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867fb85c-f823-4fea-b334-e6a6da8481b1-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-klp46\" (UID: \"867fb85c-f823-4fea-b334-e6a6da8481b1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:06:57.813099 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.812968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867fb85c-f823-4fea-b334-e6a6da8481b1-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-klp46\" (UID: \"867fb85c-f823-4fea-b334-e6a6da8481b1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:06:57.813351 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.813320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867fb85c-f823-4fea-b334-e6a6da8481b1-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-klp46\" (UID: \"867fb85c-f823-4fea-b334-e6a6da8481b1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:06:57.879838 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:57.879796 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:06:58.007968 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:58.007938 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"]
Apr 16 20:06:58.010522 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:06:58.010495 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867fb85c_f823_4fea_b334_e6a6da8481b1.slice/crio-872d53fc8967749354b9d05b41e333ae0f0e010dbe516b1544c4baf3c3a15f65 WatchSource:0}: Error finding container 872d53fc8967749354b9d05b41e333ae0f0e010dbe516b1544c4baf3c3a15f65: Status 404 returned error can't find the container with id 872d53fc8967749354b9d05b41e333ae0f0e010dbe516b1544c4baf3c3a15f65
Apr 16 20:06:58.733820 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:58.733769 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:06:58.734546 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:58.734519 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:59.007717 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:59.007677 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" event={"ID":"867fb85c-f823-4fea-b334-e6a6da8481b1","Type":"ContainerStarted","Data":"9abae196f2cd0cebe3e192a45d20097b38b508f38dced1db2646536b54f16128"}
Apr 16 20:06:59.007946 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:06:59.007722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" event={"ID":"867fb85c-f823-4fea-b334-e6a6da8481b1","Type":"ContainerStarted","Data":"872d53fc8967749354b9d05b41e333ae0f0e010dbe516b1544c4baf3c3a15f65"}
Apr 16 20:07:02.019005 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:02.018964 2570 generic.go:358] "Generic (PLEG): container finished" podID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerID="9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1" exitCode=0
Apr 16 20:07:02.019447 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:02.019042 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerDied","Data":"9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1"}
Apr 16 20:07:02.020408 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:02.020386 2570 generic.go:358] "Generic (PLEG): container finished" podID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerID="9abae196f2cd0cebe3e192a45d20097b38b508f38dced1db2646536b54f16128" exitCode=0
Apr 16 20:07:02.020517 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:02.020462 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" event={"ID":"867fb85c-f823-4fea-b334-e6a6da8481b1","Type":"ContainerDied","Data":"9abae196f2cd0cebe3e192a45d20097b38b508f38dced1db2646536b54f16128"}
Apr 16 20:07:08.733348 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:08.733274 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:07:08.733807 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:08.733581 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:07:10.056362 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:10.056327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" event={"ID":"867fb85c-f823-4fea-b334-e6a6da8481b1","Type":"ContainerStarted","Data":"e7a8a550d75a0bde85bfc9254012811128668737b82745c20e5023b68f610586"}
Apr 16 20:07:10.056821 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:10.056614 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:07:10.057814 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:10.057792 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:07:10.074909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:10.074857 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podStartSLOduration=5.757137113 podStartE2EDuration="13.074842009s" podCreationTimestamp="2026-04-16 20:06:57 +0000 UTC" firstStartedPulling="2026-04-16 20:07:02.021685261 +0000 UTC m=+785.556551275" lastFinishedPulling="2026-04-16 20:07:09.339390157 +0000 UTC m=+792.874256171" observedRunningTime="2026-04-16 20:07:10.072803319 +0000 UTC m=+793.607669354" watchObservedRunningTime="2026-04-16 20:07:10.074842009 +0000 UTC m=+793.609708356"
Apr 16 20:07:11.060535 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:11.060495 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:07:18.733936 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:18.733841 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:07:18.734397 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:18.733975 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"
Apr 16 20:07:18.734397 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:18.734277 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:07:18.734397 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:18.734363 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"
Apr 16 20:07:21.060914 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:21.060865 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:07:27.721791 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:27.721762 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"
Apr 16 20:07:27.767636 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:27.767604 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c26d767-e75f-43c1-afb3-c7a4f7c9018b-kserve-provision-location\") pod \"9c26d767-e75f-43c1-afb3-c7a4f7c9018b\" (UID: \"9c26d767-e75f-43c1-afb3-c7a4f7c9018b\") "
Apr 16 20:07:27.767988 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:27.767960 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c26d767-e75f-43c1-afb3-c7a4f7c9018b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9c26d767-e75f-43c1-afb3-c7a4f7c9018b" (UID: "9c26d767-e75f-43c1-afb3-c7a4f7c9018b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:07:27.868929 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:27.868841 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c26d767-e75f-43c1-afb3-c7a4f7c9018b-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:07:28.129591 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.129506 2570 generic.go:358] "Generic (PLEG): container finished" podID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerID="8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e" exitCode=137
Apr 16 20:07:28.129591 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.129546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerDied","Data":"8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e"}
Apr 16 20:07:28.129591 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.129568 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57" event={"ID":"9c26d767-e75f-43c1-afb3-c7a4f7c9018b","Type":"ContainerDied","Data":"995cb317899c0b4cf20f13e70d859f063e9a452ec0dea5d8fcea7409e1edb9f4"}
Apr 16 20:07:28.129591 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.129581 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"
Apr 16 20:07:28.129849 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.129584 2570 scope.go:117] "RemoveContainer" containerID="8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e"
Apr 16 20:07:28.138543 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.138518 2570 scope.go:117] "RemoveContainer" containerID="9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1"
Apr 16 20:07:28.145662 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.145643 2570 scope.go:117] "RemoveContainer" containerID="bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8"
Apr 16 20:07:28.153131 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.153091 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"]
Apr 16 20:07:28.154526 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.154506 2570 scope.go:117] "RemoveContainer" containerID="8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e"
Apr 16 20:07:28.154851 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:07:28.154824 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e\": container with ID starting with 8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e not found: ID does not exist" containerID="8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e"
Apr 16 20:07:28.154929 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.154864 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e"} err="failed to get container status \"8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e\": rpc error: code = NotFound desc = could not find container \"8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e\": container with ID starting with 8203ed1eb8156f31b4471a126335ef45427e84f7541e5a63d7f9529c54075f7e not found: ID does not exist"
Apr 16 20:07:28.154929 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.154893 2570 scope.go:117] "RemoveContainer" containerID="9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1"
Apr 16 20:07:28.155186 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:07:28.155157 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1\": container with ID starting with 9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1 not found: ID does not exist" containerID="9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1"
Apr 16 20:07:28.155291 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.155194 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1"} err="failed to get container status \"9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1\": rpc error: code = NotFound desc = could not find container \"9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1\": container with ID starting with 9d44d532dc3782d755d1e509152d27025604b002d6ded589f7526bf4a1bcf8a1 not found: ID does not exist"
Apr 16 20:07:28.155291 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.155215 2570 scope.go:117] "RemoveContainer" containerID="bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8"
Apr 16 20:07:28.155487 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:07:28.155468 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8\": container with ID starting with bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8 not found: ID does not exist" containerID="bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8"
Apr 16 20:07:28.155537 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.155494 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8"} err="failed to get container status \"bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8\": rpc error: code = NotFound desc = could not find container \"bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8\": container with ID starting with bb83541908ade7a690d826e8b117677d9dd5432cd5b1532e915478be4f52ada8 not found: ID does not exist"
Apr 16 20:07:28.155575 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:28.155552 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-hff57"]
Apr 16 20:07:29.112015 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:29.111984 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" path="/var/lib/kubelet/pods/9c26d767-e75f-43c1-afb3-c7a4f7c9018b/volumes"
Apr 16 20:07:31.061102 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:31.061029 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:07:41.061450 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:41.061398 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:07:51.060927 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:07:51.060879 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:08:01.060587 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:01.060536 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:08:11.061131 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:11.061074 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:08:12.108005 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:12.107956 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:08:22.108949 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:22.108895 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:08:32.109084 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:32.109027 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:08:37.714375 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.714338 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"]
Apr 16 20:08:37.714881 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.714614 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" containerID="cri-o://e7a8a550d75a0bde85bfc9254012811128668737b82745c20e5023b68f610586" gracePeriod=30
Apr 16 20:08:37.814297 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814257 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"]
Apr 16 20:08:37.814741 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814725 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container"
Apr 16 20:08:37.814784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814746 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container"
Apr 16 20:08:37.814784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814767 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent"
Apr 16 20:08:37.814784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814774 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent"
Apr 16 20:08:37.814905 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814789 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="storage-initializer"
Apr 16 20:08:37.814905 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814795 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="storage-initializer"
Apr 16 20:08:37.814905 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814861 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="agent"
Apr 16 20:08:37.814905 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.814872 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c26d767-e75f-43c1-afb3-c7a4f7c9018b" containerName="kserve-container"
Apr 16 20:08:37.817268 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.817245 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:08:37.827978 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.827956 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"]
Apr 16 20:08:37.892324 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.892284 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4b82866-3aa3-43aa-b28b-423ec3248d05-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5\" (UID: \"a4b82866-3aa3-43aa-b28b-423ec3248d05\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:08:37.992913 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.992816 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4b82866-3aa3-43aa-b28b-423ec3248d05-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5\" (UID: \"a4b82866-3aa3-43aa-b28b-423ec3248d05\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:08:37.993250 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:37.993228 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4b82866-3aa3-43aa-b28b-423ec3248d05-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5\" (UID: \"a4b82866-3aa3-43aa-b28b-423ec3248d05\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:08:38.127947 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:38.127899 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:08:38.253306 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:38.253268 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"]
Apr 16 20:08:38.256898 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:08:38.256861 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4b82866_3aa3_43aa_b28b_423ec3248d05.slice/crio-b88aee60f47a46e335770809ffc5bf4583a0de1fc3607cffed6060f83ad8fe6e WatchSource:0}: Error finding container b88aee60f47a46e335770809ffc5bf4583a0de1fc3607cffed6060f83ad8fe6e: Status 404 returned error can't find the container with id b88aee60f47a46e335770809ffc5bf4583a0de1fc3607cffed6060f83ad8fe6e
Apr 16 20:08:38.372804 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:38.372765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" event={"ID":"a4b82866-3aa3-43aa-b28b-423ec3248d05","Type":"ContainerStarted","Data":"9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21"}
Apr 16 20:08:38.372804 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:38.372804 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" event={"ID":"a4b82866-3aa3-43aa-b28b-423ec3248d05","Type":"ContainerStarted","Data":"b88aee60f47a46e335770809ffc5bf4583a0de1fc3607cffed6060f83ad8fe6e"}
Apr 16 20:08:42.107981 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:42.107937 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:08:42.387687 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:42.387645 2570 generic.go:358] "Generic (PLEG): container finished" podID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerID="9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21" exitCode=0
Apr 16 20:08:42.387687 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:42.387688 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" event={"ID":"a4b82866-3aa3-43aa-b28b-423ec3248d05","Type":"ContainerDied","Data":"9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21"}
Apr 16 20:08:43.393546 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.393511 2570 generic.go:358] "Generic (PLEG): container finished" podID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerID="e7a8a550d75a0bde85bfc9254012811128668737b82745c20e5023b68f610586" exitCode=0
Apr 16 20:08:43.393980 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.393591 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" event={"ID":"867fb85c-f823-4fea-b334-e6a6da8481b1","Type":"ContainerDied","Data":"e7a8a550d75a0bde85bfc9254012811128668737b82745c20e5023b68f610586"}
Apr 16 20:08:43.395341 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.395316 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" event={"ID":"a4b82866-3aa3-43aa-b28b-423ec3248d05","Type":"ContainerStarted","Data":"0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba"}
Apr 16 20:08:43.395642 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.395598 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:08:43.396604 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.396580 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:08:43.413075 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.413015 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podStartSLOduration=6.413000151 podStartE2EDuration="6.413000151s" podCreationTimestamp="2026-04-16 20:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:08:43.410949757 +0000 UTC m=+886.945815790" watchObservedRunningTime="2026-04-16 20:08:43.413000151 +0000 UTC m=+886.947866187"
Apr 16 20:08:43.463771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.463747 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:08:43.534892 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.534858 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867fb85c-f823-4fea-b334-e6a6da8481b1-kserve-provision-location\") pod \"867fb85c-f823-4fea-b334-e6a6da8481b1\" (UID: \"867fb85c-f823-4fea-b334-e6a6da8481b1\") "
Apr 16 20:08:43.535337 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.535306 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/867fb85c-f823-4fea-b334-e6a6da8481b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "867fb85c-f823-4fea-b334-e6a6da8481b1" (UID: "867fb85c-f823-4fea-b334-e6a6da8481b1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:08:43.635651 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:43.635604 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867fb85c-f823-4fea-b334-e6a6da8481b1-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:08:44.399888 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.399850 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"
Apr 16 20:08:44.400378 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.399854 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46" event={"ID":"867fb85c-f823-4fea-b334-e6a6da8481b1","Type":"ContainerDied","Data":"872d53fc8967749354b9d05b41e333ae0f0e010dbe516b1544c4baf3c3a15f65"}
Apr 16 20:08:44.400378 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.400051 2570 scope.go:117] "RemoveContainer" containerID="e7a8a550d75a0bde85bfc9254012811128668737b82745c20e5023b68f610586"
Apr 16 20:08:44.400568 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.400541 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:08:44.408269 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.408248 2570 scope.go:117] "RemoveContainer" containerID="9abae196f2cd0cebe3e192a45d20097b38b508f38dced1db2646536b54f16128"
Apr 16 20:08:44.424002 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.423969 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"]
Apr 16 20:08:44.426817 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:44.426795 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-klp46"]
Apr 16 20:08:45.111618 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:45.111586 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" path="/var/lib/kubelet/pods/867fb85c-f823-4fea-b334-e6a6da8481b1/volumes"
Apr 16 20:08:54.400369 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:08:54.400321 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:09:04.400435 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:09:04.400389 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:09:14.401175 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:09:14.401128 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:09:24.400784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:09:24.400742 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:09:34.400274 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:09:34.400223 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:09:44.400637 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:09:44.400590 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:09:54.401075 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:09:54.401024 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:10:04.401425 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:04.401394 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"
Apr 16 20:10:08.202150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.202113 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"]
Apr 16 20:10:08.209461 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.202349 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" containerID="cri-o://0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba" gracePeriod=30
Apr 16 20:10:08.271269 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.271231 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz"]
Apr 16 20:10:08.271683 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.271667 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="storage-initializer"
Apr 16 20:10:08.271732 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.271685 2570 state_mem.go:107] "Deleted CPUSet assignment"
podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="storage-initializer" Apr 16 20:10:08.271732 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.271709 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" Apr 16 20:10:08.271732 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.271714 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" Apr 16 20:10:08.271823 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.271771 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="867fb85c-f823-4fea-b334-e6a6da8481b1" containerName="kserve-container" Apr 16 20:10:08.273904 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.273888 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:10:08.285516 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.285492 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz"] Apr 16 20:10:08.372759 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.372723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/996af208-7f58-45b2-abcc-77487a704dd2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz\" (UID: \"996af208-7f58-45b2-abcc-77487a704dd2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:10:08.473457 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.473343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/996af208-7f58-45b2-abcc-77487a704dd2-kserve-provision-location\") 
pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz\" (UID: \"996af208-7f58-45b2-abcc-77487a704dd2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:10:08.473796 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.473770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/996af208-7f58-45b2-abcc-77487a704dd2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz\" (UID: \"996af208-7f58-45b2-abcc-77487a704dd2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:10:08.584776 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.584744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:10:08.712893 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:08.712868 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz"] Apr 16 20:10:08.715557 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:10:08.715523 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996af208_7f58_45b2_abcc_77487a704dd2.slice/crio-c1f127b17d32521bd3e2ded30c58281a8805303547a556f7e632017f156f7294 WatchSource:0}: Error finding container c1f127b17d32521bd3e2ded30c58281a8805303547a556f7e632017f156f7294: Status 404 returned error can't find the container with id c1f127b17d32521bd3e2ded30c58281a8805303547a556f7e632017f156f7294 Apr 16 20:10:09.702489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:09.702452 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" 
event={"ID":"996af208-7f58-45b2-abcc-77487a704dd2","Type":"ContainerStarted","Data":"471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5"} Apr 16 20:10:09.702489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:09.702496 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" event={"ID":"996af208-7f58-45b2-abcc-77487a704dd2","Type":"ContainerStarted","Data":"c1f127b17d32521bd3e2ded30c58281a8805303547a556f7e632017f156f7294"} Apr 16 20:10:12.715290 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:12.715257 2570 generic.go:358] "Generic (PLEG): container finished" podID="996af208-7f58-45b2-abcc-77487a704dd2" containerID="471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5" exitCode=0 Apr 16 20:10:12.715665 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:12.715331 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" event={"ID":"996af208-7f58-45b2-abcc-77487a704dd2","Type":"ContainerDied","Data":"471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5"} Apr 16 20:10:13.046030 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.045990 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" Apr 16 20:10:13.110914 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.110878 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4b82866-3aa3-43aa-b28b-423ec3248d05-kserve-provision-location\") pod \"a4b82866-3aa3-43aa-b28b-423ec3248d05\" (UID: \"a4b82866-3aa3-43aa-b28b-423ec3248d05\") " Apr 16 20:10:13.111280 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.111240 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b82866-3aa3-43aa-b28b-423ec3248d05-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a4b82866-3aa3-43aa-b28b-423ec3248d05" (UID: "a4b82866-3aa3-43aa-b28b-423ec3248d05"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:13.211921 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.211811 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4b82866-3aa3-43aa-b28b-423ec3248d05-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:10:13.720661 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.720623 2570 generic.go:358] "Generic (PLEG): container finished" podID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerID="0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba" exitCode=0 Apr 16 20:10:13.721150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.720689 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" Apr 16 20:10:13.721150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.720712 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" event={"ID":"a4b82866-3aa3-43aa-b28b-423ec3248d05","Type":"ContainerDied","Data":"0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba"} Apr 16 20:10:13.721150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.720756 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5" event={"ID":"a4b82866-3aa3-43aa-b28b-423ec3248d05","Type":"ContainerDied","Data":"b88aee60f47a46e335770809ffc5bf4583a0de1fc3607cffed6060f83ad8fe6e"} Apr 16 20:10:13.721150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.720778 2570 scope.go:117] "RemoveContainer" containerID="0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba" Apr 16 20:10:13.729027 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.729005 2570 scope.go:117] "RemoveContainer" containerID="9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21" Apr 16 20:10:13.739124 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.739028 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"] Apr 16 20:10:13.740973 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.740928 2570 scope.go:117] "RemoveContainer" containerID="0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba" Apr 16 20:10:13.741901 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:10:13.741721 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba\": container with ID starting with 0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba not found: 
ID does not exist" containerID="0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba" Apr 16 20:10:13.741901 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.741763 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba"} err="failed to get container status \"0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba\": rpc error: code = NotFound desc = could not find container \"0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba\": container with ID starting with 0df737a76d7de0a54100f26c7d3d94c9ff6846bd04c08dbb7f91cb03632b90ba not found: ID does not exist" Apr 16 20:10:13.741901 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.741791 2570 scope.go:117] "RemoveContainer" containerID="9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21" Apr 16 20:10:13.742957 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.742933 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-xcsx5"] Apr 16 20:10:13.742957 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:10:13.742949 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21\": container with ID starting with 9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21 not found: ID does not exist" containerID="9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21" Apr 16 20:10:13.743131 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:13.742977 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21"} err="failed to get container status \"9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21\": rpc error: code = NotFound desc = could not 
find container \"9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21\": container with ID starting with 9b83ae8070fd4757798ad31e5e41240a2fa3bab68c15b24f32677b12ac1c5d21 not found: ID does not exist" Apr 16 20:10:15.118139 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:10:15.118068 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" path="/var/lib/kubelet/pods/a4b82866-3aa3-43aa-b28b-423ec3248d05/volumes" Apr 16 20:12:33.255593 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:12:33.255557 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" event={"ID":"996af208-7f58-45b2-abcc-77487a704dd2","Type":"ContainerStarted","Data":"779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429"} Apr 16 20:12:33.256011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:12:33.255730 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:12:33.284136 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:12:33.284083 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" podStartSLOduration=5.161630714 podStartE2EDuration="2m25.284048369s" podCreationTimestamp="2026-04-16 20:10:08 +0000 UTC" firstStartedPulling="2026-04-16 20:10:12.716342075 +0000 UTC m=+976.251208093" lastFinishedPulling="2026-04-16 20:12:32.838759734 +0000 UTC m=+1116.373625748" observedRunningTime="2026-04-16 20:12:33.281784248 +0000 UTC m=+1116.816650285" watchObservedRunningTime="2026-04-16 20:12:33.284048369 +0000 UTC m=+1116.818914475" Apr 16 20:13:04.265230 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:04.265195 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 
20:13:08.521244 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.521212 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz"] Apr 16 20:13:08.521659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.521468 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" podUID="996af208-7f58-45b2-abcc-77487a704dd2" containerName="kserve-container" containerID="cri-o://779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429" gracePeriod=30 Apr 16 20:13:08.607450 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.607414 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b"] Apr 16 20:13:08.607914 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.607877 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="storage-initializer" Apr 16 20:13:08.607914 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.607910 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="storage-initializer" Apr 16 20:13:08.608095 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.607939 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" Apr 16 20:13:08.608095 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.607949 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" Apr 16 20:13:08.608095 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.608044 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4b82866-3aa3-43aa-b28b-423ec3248d05" containerName="kserve-container" Apr 16 20:13:08.632000 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:13:08.631959 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b"] Apr 16 20:13:08.632201 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.632175 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:08.740032 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.739992 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/716cbe2e-3326-45e4-b2ec-b85d30f83421-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b\" (UID: \"716cbe2e-3326-45e4-b2ec-b85d30f83421\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:08.840776 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.840662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/716cbe2e-3326-45e4-b2ec-b85d30f83421-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b\" (UID: \"716cbe2e-3326-45e4-b2ec-b85d30f83421\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:08.841146 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.841125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/716cbe2e-3326-45e4-b2ec-b85d30f83421-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b\" (UID: \"716cbe2e-3326-45e4-b2ec-b85d30f83421\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:08.945949 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:08.945914 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:09.079290 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.079263 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b"] Apr 16 20:13:09.081677 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:13:09.081643 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716cbe2e_3326_45e4_b2ec_b85d30f83421.slice/crio-fd18a2d85da9333ea84a6e542b7eb227261b427b8cf131c89e0f53fb98172d61 WatchSource:0}: Error finding container fd18a2d85da9333ea84a6e542b7eb227261b427b8cf131c89e0f53fb98172d61: Status 404 returned error can't find the container with id fd18a2d85da9333ea84a6e542b7eb227261b427b8cf131c89e0f53fb98172d61 Apr 16 20:13:09.083437 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.083422 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:13:09.381343 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.381227 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" event={"ID":"716cbe2e-3326-45e4-b2ec-b85d30f83421","Type":"ContainerStarted","Data":"5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845"} Apr 16 20:13:09.381343 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.381290 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" event={"ID":"716cbe2e-3326-45e4-b2ec-b85d30f83421","Type":"ContainerStarted","Data":"fd18a2d85da9333ea84a6e542b7eb227261b427b8cf131c89e0f53fb98172d61"} Apr 16 20:13:09.665143 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.665118 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:13:09.748683 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.748636 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/996af208-7f58-45b2-abcc-77487a704dd2-kserve-provision-location\") pod \"996af208-7f58-45b2-abcc-77487a704dd2\" (UID: \"996af208-7f58-45b2-abcc-77487a704dd2\") " Apr 16 20:13:09.748979 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.748954 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996af208-7f58-45b2-abcc-77487a704dd2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "996af208-7f58-45b2-abcc-77487a704dd2" (UID: "996af208-7f58-45b2-abcc-77487a704dd2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:09.849850 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:09.849809 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/996af208-7f58-45b2-abcc-77487a704dd2-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:13:10.386004 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.385965 2570 generic.go:358] "Generic (PLEG): container finished" podID="996af208-7f58-45b2-abcc-77487a704dd2" containerID="779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429" exitCode=0 Apr 16 20:13:10.386194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.386026 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" Apr 16 20:13:10.386194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.386051 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" event={"ID":"996af208-7f58-45b2-abcc-77487a704dd2","Type":"ContainerDied","Data":"779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429"} Apr 16 20:13:10.386194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.386105 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz" event={"ID":"996af208-7f58-45b2-abcc-77487a704dd2","Type":"ContainerDied","Data":"c1f127b17d32521bd3e2ded30c58281a8805303547a556f7e632017f156f7294"} Apr 16 20:13:10.386194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.386124 2570 scope.go:117] "RemoveContainer" containerID="779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429" Apr 16 20:13:10.394911 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.394889 2570 scope.go:117] "RemoveContainer" containerID="471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5" Apr 16 20:13:10.402489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.402468 2570 scope.go:117] "RemoveContainer" containerID="779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429" Apr 16 20:13:10.402811 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:13:10.402788 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429\": container with ID starting with 779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429 not found: ID does not exist" containerID="779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429" Apr 16 20:13:10.402963 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.402821 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429"} err="failed to get container status \"779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429\": rpc error: code = NotFound desc = could not find container \"779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429\": container with ID starting with 779052aa9d178495873325335896422add206a5f363bbab8dd176633ded8f429 not found: ID does not exist" Apr 16 20:13:10.402963 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.402841 2570 scope.go:117] "RemoveContainer" containerID="471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5" Apr 16 20:13:10.403096 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:13:10.403069 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5\": container with ID starting with 471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5 not found: ID does not exist" containerID="471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5" Apr 16 20:13:10.403096 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.403087 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5"} err="failed to get container status \"471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5\": rpc error: code = NotFound desc = could not find container \"471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5\": container with ID starting with 471ebcfe268e7140258099bd8af9df429d600a377f241f04c664ef71ef32dce5 not found: ID does not exist" Apr 16 20:13:10.408903 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.408878 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz"] Apr 16 20:13:10.412194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:10.412170 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-pstkz"] Apr 16 20:13:11.111635 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:11.111604 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996af208-7f58-45b2-abcc-77487a704dd2" path="/var/lib/kubelet/pods/996af208-7f58-45b2-abcc-77487a704dd2/volumes" Apr 16 20:13:13.397563 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:13.397526 2570 generic.go:358] "Generic (PLEG): container finished" podID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerID="5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845" exitCode=0 Apr 16 20:13:13.398047 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:13.397570 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" event={"ID":"716cbe2e-3326-45e4-b2ec-b85d30f83421","Type":"ContainerDied","Data":"5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845"} Apr 16 20:13:14.402652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:14.402614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" event={"ID":"716cbe2e-3326-45e4-b2ec-b85d30f83421","Type":"ContainerStarted","Data":"7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3"} Apr 16 20:13:14.403156 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:14.402900 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:14.404421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:14.404379 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:13:14.421601 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:14.421536 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" podStartSLOduration=6.421514502 podStartE2EDuration="6.421514502s" podCreationTimestamp="2026-04-16 20:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:14.420108457 +0000 UTC m=+1157.954974494" watchObservedRunningTime="2026-04-16 20:13:14.421514502 +0000 UTC m=+1157.956380539" Apr 16 20:13:15.406119 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:15.406019 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:13:25.407093 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:25.407034 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:28.697045 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.697014 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b"] Apr 16 20:13:28.697429 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.697282 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="kserve-container" 
containerID="cri-o://7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3" gracePeriod=30 Apr 16 20:13:28.729049 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.729013 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb"] Apr 16 20:13:28.729458 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.729442 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="996af208-7f58-45b2-abcc-77487a704dd2" containerName="kserve-container" Apr 16 20:13:28.729504 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.729459 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="996af208-7f58-45b2-abcc-77487a704dd2" containerName="kserve-container" Apr 16 20:13:28.729504 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.729476 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="996af208-7f58-45b2-abcc-77487a704dd2" containerName="storage-initializer" Apr 16 20:13:28.729504 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.729481 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="996af208-7f58-45b2-abcc-77487a704dd2" containerName="storage-initializer" Apr 16 20:13:28.729602 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.729537 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="996af208-7f58-45b2-abcc-77487a704dd2" containerName="kserve-container" Apr 16 20:13:28.732611 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.732595 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:13:28.739754 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.739725 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb"] Apr 16 20:13:28.822864 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.822827 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb\" (UID: \"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:13:28.924161 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.924115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb\" (UID: \"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:13:28.924505 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:28.924485 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb\" (UID: \"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:13:29.045524 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.045489 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:13:29.182993 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.182790 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb"] Apr 16 20:13:29.185919 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:13:29.185884 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb55659_b4a1_4c7e_ae01_b4fb96e75e37.slice/crio-3f16c4014d4b1ef2c1566a4acfd4fba52ec3ac0ca4e188ff2e86e26bdc5d059c WatchSource:0}: Error finding container 3f16c4014d4b1ef2c1566a4acfd4fba52ec3ac0ca4e188ff2e86e26bdc5d059c: Status 404 returned error can't find the container with id 3f16c4014d4b1ef2c1566a4acfd4fba52ec3ac0ca4e188ff2e86e26bdc5d059c Apr 16 20:13:29.440915 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.440891 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:29.459823 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.459412 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" event={"ID":"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37","Type":"ContainerStarted","Data":"0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f"} Apr 16 20:13:29.459823 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.459465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" event={"ID":"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37","Type":"ContainerStarted","Data":"3f16c4014d4b1ef2c1566a4acfd4fba52ec3ac0ca4e188ff2e86e26bdc5d059c"} Apr 16 20:13:29.461748 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.461672 2570 generic.go:358] "Generic (PLEG): container finished" podID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerID="7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3" exitCode=0 Apr 16 20:13:29.461881 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.461746 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" Apr 16 20:13:29.461881 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.461747 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" event={"ID":"716cbe2e-3326-45e4-b2ec-b85d30f83421","Type":"ContainerDied","Data":"7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3"} Apr 16 20:13:29.461881 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.461869 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b" event={"ID":"716cbe2e-3326-45e4-b2ec-b85d30f83421","Type":"ContainerDied","Data":"fd18a2d85da9333ea84a6e542b7eb227261b427b8cf131c89e0f53fb98172d61"} Apr 16 20:13:29.462005 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.461902 2570 scope.go:117] "RemoveContainer" containerID="7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3" Apr 16 20:13:29.471368 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.471341 2570 scope.go:117] "RemoveContainer" containerID="5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845" Apr 16 20:13:29.480117 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.480093 2570 scope.go:117] "RemoveContainer" containerID="7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3" Apr 16 20:13:29.480398 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:13:29.480377 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3\": container with ID starting with 7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3 not found: ID does not exist" containerID="7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3" Apr 16 20:13:29.480480 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.480406 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3"} err="failed to get container status \"7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3\": rpc error: code = NotFound desc = could not find container \"7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3\": container with ID starting with 7c2b54269892633de534147775ae16006ebc28927b60f515e7f422952c119bd3 not found: ID does not exist" Apr 16 20:13:29.480480 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.480425 2570 scope.go:117] "RemoveContainer" containerID="5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845" Apr 16 20:13:29.480690 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:13:29.480664 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845\": container with ID starting with 5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845 not found: ID does not exist" containerID="5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845" Apr 16 20:13:29.480733 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.480702 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845"} err="failed to get container status \"5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845\": rpc error: code = NotFound desc = could not find container \"5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845\": container with ID starting with 5e72305a740deb79151acb1627a00d12fe65b5c69489e8a8b6621e55cf874845 not found: ID does not exist" Apr 16 20:13:29.530150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.530107 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/716cbe2e-3326-45e4-b2ec-b85d30f83421-kserve-provision-location\") pod \"716cbe2e-3326-45e4-b2ec-b85d30f83421\" (UID: \"716cbe2e-3326-45e4-b2ec-b85d30f83421\") " Apr 16 20:13:29.530477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.530452 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716cbe2e-3326-45e4-b2ec-b85d30f83421-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "716cbe2e-3326-45e4-b2ec-b85d30f83421" (UID: "716cbe2e-3326-45e4-b2ec-b85d30f83421"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:29.631372 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.631277 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/716cbe2e-3326-45e4-b2ec-b85d30f83421-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:13:29.786754 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.786724 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b"] Apr 16 20:13:29.790488 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:29.790461 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-lxt5b"] Apr 16 20:13:31.111923 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:31.111881 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" path="/var/lib/kubelet/pods/716cbe2e-3326-45e4-b2ec-b85d30f83421/volumes" Apr 16 20:13:33.478959 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:33.478916 2570 generic.go:358] "Generic (PLEG): container finished" podID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" 
containerID="0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f" exitCode=0 Apr 16 20:13:33.479352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:33.478994 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" event={"ID":"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37","Type":"ContainerDied","Data":"0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f"} Apr 16 20:13:34.485392 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:34.485350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" event={"ID":"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37","Type":"ContainerStarted","Data":"f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef"} Apr 16 20:13:34.485786 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:34.485586 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:13:34.503373 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:13:34.503312 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" podStartSLOduration=6.503290295 podStartE2EDuration="6.503290295s" podCreationTimestamp="2026-04-16 20:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:34.502343356 +0000 UTC m=+1178.037209395" watchObservedRunningTime="2026-04-16 20:13:34.503290295 +0000 UTC m=+1178.038156333" Apr 16 20:14:05.494671 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:05.494639 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:14:08.869104 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.869048 2570 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb"] Apr 16 20:14:08.869497 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.869423 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerName="kserve-container" containerID="cri-o://f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef" gracePeriod=30 Apr 16 20:14:08.927626 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.927588 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"] Apr 16 20:14:08.928000 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.927987 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="kserve-container" Apr 16 20:14:08.928078 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.928002 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="kserve-container" Apr 16 20:14:08.928078 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.928018 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="storage-initializer" Apr 16 20:14:08.928078 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.928024 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="storage-initializer" Apr 16 20:14:08.928286 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.928106 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="716cbe2e-3326-45e4-b2ec-b85d30f83421" containerName="kserve-container" Apr 16 20:14:08.931406 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.931384 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:08.941463 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:08.941432 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"] Apr 16 20:14:09.076031 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.075993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m\" (UID: \"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:09.176808 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.176712 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m\" (UID: \"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:09.177118 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.177097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m\" (UID: \"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:09.242810 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.242758 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:09.378288 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.378261 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"] Apr 16 20:14:09.380670 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:14:09.380640 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5dbe4a_60e7_4c3b_99ad_2dbfd811ae81.slice/crio-48fdc7491e0814498122a99a53cd7cad62884c93b2876105d25a63371a893ba9 WatchSource:0}: Error finding container 48fdc7491e0814498122a99a53cd7cad62884c93b2876105d25a63371a893ba9: Status 404 returned error can't find the container with id 48fdc7491e0814498122a99a53cd7cad62884c93b2876105d25a63371a893ba9 Apr 16 20:14:09.614969 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.614931 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerStarted","Data":"d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2"} Apr 16 20:14:09.615162 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:09.614977 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerStarted","Data":"48fdc7491e0814498122a99a53cd7cad62884c93b2876105d25a63371a893ba9"} Apr 16 20:14:10.194101 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.194071 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:14:10.285133 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.285047 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37-kserve-provision-location\") pod \"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37\" (UID: \"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37\") " Apr 16 20:14:10.285411 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.285383 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" (UID: "ceb55659-b4a1-4c7e-ae01-b4fb96e75e37"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:10.386682 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.386580 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:14:10.620000 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.619968 2570 generic.go:358] "Generic (PLEG): container finished" podID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerID="f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef" exitCode=0 Apr 16 20:14:10.620192 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.620033 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" Apr 16 20:14:10.620192 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.620049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" event={"ID":"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37","Type":"ContainerDied","Data":"f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef"} Apr 16 20:14:10.620192 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.620097 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb" event={"ID":"ceb55659-b4a1-4c7e-ae01-b4fb96e75e37","Type":"ContainerDied","Data":"3f16c4014d4b1ef2c1566a4acfd4fba52ec3ac0ca4e188ff2e86e26bdc5d059c"} Apr 16 20:14:10.620192 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.620117 2570 scope.go:117] "RemoveContainer" containerID="f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef" Apr 16 20:14:10.628856 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.628838 2570 scope.go:117] "RemoveContainer" containerID="0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f" Apr 16 20:14:10.635980 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.635963 2570 scope.go:117] "RemoveContainer" containerID="f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef" Apr 16 20:14:10.636264 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:14:10.636245 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef\": container with ID starting with f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef not found: ID does not exist" containerID="f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef" Apr 16 20:14:10.636360 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.636273 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef"} err="failed to get container status \"f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef\": rpc error: code = NotFound desc = could not find container \"f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef\": container with ID starting with f357b212c82685917959f0edfc648643aed3473a44d77bb1245c639e55e00bef not found: ID does not exist" Apr 16 20:14:10.636360 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.636296 2570 scope.go:117] "RemoveContainer" containerID="0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f" Apr 16 20:14:10.636529 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:14:10.636511 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f\": container with ID starting with 0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f not found: ID does not exist" containerID="0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f" Apr 16 20:14:10.636568 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.636536 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f"} err="failed to get container status \"0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f\": rpc error: code = NotFound desc = could not find container \"0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f\": container with ID starting with 0833816a87dedaf1aecd2b19dd025668c3403677703876f089e94ab70764837f not found: ID does not exist" Apr 16 20:14:10.642088 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.642015 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb"] Apr 16 20:14:10.646658 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:10.646636 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-wrtvb"] Apr 16 20:14:11.111711 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:11.111675 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" path="/var/lib/kubelet/pods/ceb55659-b4a1-4c7e-ae01-b4fb96e75e37/volumes" Apr 16 20:14:13.633468 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:13.633432 2570 generic.go:358] "Generic (PLEG): container finished" podID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerID="d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2" exitCode=0 Apr 16 20:14:13.633959 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:13.633509 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerDied","Data":"d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2"} Apr 16 20:14:14.639162 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:14.639083 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerStarted","Data":"e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225"} Apr 16 20:14:17.658498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:17.658460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerStarted","Data":"e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d"} Apr 16 20:14:17.658899 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:17.658595 2570 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:17.678311 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:17.678258 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" podStartSLOduration=6.443385511 podStartE2EDuration="9.678224108s" podCreationTimestamp="2026-04-16 20:14:08 +0000 UTC" firstStartedPulling="2026-04-16 20:14:13.704612037 +0000 UTC m=+1217.239478051" lastFinishedPulling="2026-04-16 20:14:16.939450632 +0000 UTC m=+1220.474316648" observedRunningTime="2026-04-16 20:14:17.676883298 +0000 UTC m=+1221.211749335" watchObservedRunningTime="2026-04-16 20:14:17.678224108 +0000 UTC m=+1221.213090145" Apr 16 20:14:18.662630 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:18.662599 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:14:49.668685 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:14:49.668599 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:15:19.669890 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:19.669859 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" Apr 16 20:15:29.016846 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.016806 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"] Apr 16 20:15:29.017436 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.017169 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container" 
containerID="cri-o://e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225" gracePeriod=30
Apr 16 20:15:29.017436 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.017225 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-agent" containerID="cri-o://e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d" gracePeriod=30
Apr 16 20:15:29.069037 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.068998 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"]
Apr 16 20:15:29.069399 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.069386 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerName="kserve-container"
Apr 16 20:15:29.069399 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.069400 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerName="kserve-container"
Apr 16 20:15:29.069493 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.069412 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerName="storage-initializer"
Apr 16 20:15:29.069493 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.069418 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerName="storage-initializer"
Apr 16 20:15:29.069493 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.069486 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceb55659-b4a1-4c7e-ae01-b4fb96e75e37" containerName="kserve-container"
Apr 16 20:15:29.072808 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.072791 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:15:29.081138 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.081109 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"]
Apr 16 20:15:29.183001 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.182950 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1025b52d-7b29-463d-ade9-fae4353b6a59-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-pp8wb\" (UID: \"1025b52d-7b29-463d-ade9-fae4353b6a59\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:15:29.284460 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.284353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1025b52d-7b29-463d-ade9-fae4353b6a59-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-pp8wb\" (UID: \"1025b52d-7b29-463d-ade9-fae4353b6a59\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:15:29.284793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.284768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1025b52d-7b29-463d-ade9-fae4353b6a59-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-pp8wb\" (UID: \"1025b52d-7b29-463d-ade9-fae4353b6a59\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:15:29.384653 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.384610 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:15:29.506936 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.506900 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"]
Apr 16 20:15:29.509738 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:15:29.509706 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1025b52d_7b29_463d_ade9_fae4353b6a59.slice/crio-903404493e7b221bbeff600affa292749f385166c1c679b9e3f3bba5a1202903 WatchSource:0}: Error finding container 903404493e7b221bbeff600affa292749f385166c1c679b9e3f3bba5a1202903: Status 404 returned error can't find the container with id 903404493e7b221bbeff600affa292749f385166c1c679b9e3f3bba5a1202903
Apr 16 20:15:29.666194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.666146 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 20:15:29.903506 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.903423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" event={"ID":"1025b52d-7b29-463d-ade9-fae4353b6a59","Type":"ContainerStarted","Data":"67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e"}
Apr 16 20:15:29.903506 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:29.903460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" event={"ID":"1025b52d-7b29-463d-ade9-fae4353b6a59","Type":"ContainerStarted","Data":"903404493e7b221bbeff600affa292749f385166c1c679b9e3f3bba5a1202903"}
Apr 16 20:15:31.911941 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:31.911902 2570 generic.go:358] "Generic (PLEG): container finished" podID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerID="e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225" exitCode=0
Apr 16 20:15:31.912340 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:31.911956 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerDied","Data":"e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225"}
Apr 16 20:15:34.924327 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:34.924289 2570 generic.go:358] "Generic (PLEG): container finished" podID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerID="67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e" exitCode=0
Apr 16 20:15:34.924703 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:34.924342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" event={"ID":"1025b52d-7b29-463d-ade9-fae4353b6a59","Type":"ContainerDied","Data":"67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e"}
Apr 16 20:15:39.666120 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:39.666041 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 20:15:46.972215 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:46.972120 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" event={"ID":"1025b52d-7b29-463d-ade9-fae4353b6a59","Type":"ContainerStarted","Data":"cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39"}
Apr 16 20:15:46.972597 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:46.972423 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:15:46.973797 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:46.973771 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 16 20:15:46.995906 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:46.995839 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podStartSLOduration=6.229626088 podStartE2EDuration="17.995825988s" podCreationTimestamp="2026-04-16 20:15:29 +0000 UTC" firstStartedPulling="2026-04-16 20:15:34.925469219 +0000 UTC m=+1298.460335232" lastFinishedPulling="2026-04-16 20:15:46.691669118 +0000 UTC m=+1310.226535132" observedRunningTime="2026-04-16 20:15:46.992898813 +0000 UTC m=+1310.527764855" watchObservedRunningTime="2026-04-16 20:15:46.995825988 +0000 UTC m=+1310.530692023"
Apr 16 20:15:47.976928 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:47.976888 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 16 20:15:49.666230 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:49.666184 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 20:15:49.666628 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:49.666333 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"
Apr 16 20:15:57.976973 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:57.976926 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 16 20:15:59.177436 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:59.177412 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"
Apr 16 20:15:59.243221 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:59.243177 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81-kserve-provision-location\") pod \"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81\" (UID: \"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81\") "
Apr 16 20:15:59.243545 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:59.243516 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" (UID: "6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:15:59.344658 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:15:59.344567 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:16:00.019697 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.019657 2570 generic.go:358] "Generic (PLEG): container finished" podID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerID="e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d" exitCode=0
Apr 16 20:16:00.019971 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.019712 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerDied","Data":"e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d"}
Apr 16 20:16:00.019971 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.019739 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"
Apr 16 20:16:00.019971 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.019751 2570 scope.go:117] "RemoveContainer" containerID="e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d"
Apr 16 20:16:00.019971 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.019740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m" event={"ID":"6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81","Type":"ContainerDied","Data":"48fdc7491e0814498122a99a53cd7cad62884c93b2876105d25a63371a893ba9"}
Apr 16 20:16:00.028304 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.028287 2570 scope.go:117] "RemoveContainer" containerID="e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225"
Apr 16 20:16:00.035960 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.035940 2570 scope.go:117] "RemoveContainer" containerID="d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2"
Apr 16 20:16:00.041729 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.041705 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"]
Apr 16 20:16:00.044819 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.044795 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-g9l6m"]
Apr 16 20:16:00.045771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.045748 2570 scope.go:117] "RemoveContainer" containerID="e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d"
Apr 16 20:16:00.046123 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:16:00.046042 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d\": container with ID starting with e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d not found: ID does not exist" containerID="e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d"
Apr 16 20:16:00.046123 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.046082 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d"} err="failed to get container status \"e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d\": rpc error: code = NotFound desc = could not find container \"e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d\": container with ID starting with e6d322b07033cce43162cb3634eaec129317b532eaf438bf9d2df9cbc007f96d not found: ID does not exist"
Apr 16 20:16:00.046123 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.046106 2570 scope.go:117] "RemoveContainer" containerID="e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225"
Apr 16 20:16:00.046437 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:16:00.046417 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225\": container with ID starting with e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225 not found: ID does not exist" containerID="e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225"
Apr 16 20:16:00.046488 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.046443 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225"} err="failed to get container status \"e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225\": rpc error: code = NotFound desc = could not find container \"e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225\": container with ID starting with e318dfa2d88b5eeedfaa971ab97b38a73b796483ea7e4e975cfcf4bb5ca63225 not found: ID does not exist"
Apr 16 20:16:00.046488 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.046461 2570 scope.go:117] "RemoveContainer" containerID="d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2"
Apr 16 20:16:00.046710 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:16:00.046692 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2\": container with ID starting with d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2 not found: ID does not exist" containerID="d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2"
Apr 16 20:16:00.046779 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:00.046714 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2"} err="failed to get container status \"d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2\": rpc error: code = NotFound desc = could not find container \"d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2\": container with ID starting with d8e2e8c2545ae2b5f24c4a9b3042c1a5c1f683bb004913e08046b6015c75e5c2 not found: ID does not exist"
Apr 16 20:16:01.111342 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:01.111302 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" path="/var/lib/kubelet/pods/6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81/volumes"
Apr 16 20:16:07.977744 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:07.977701 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 16 20:16:17.977869 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:17.977766 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 16 20:16:27.977505 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:27.977457 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 16 20:16:37.978260 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:37.978232 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:16:40.573938 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.573902 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"]
Apr 16 20:16:40.574408 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.574175 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container" containerID="cri-o://cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39" gracePeriod=30
Apr 16 20:16:40.671953 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.671911 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"]
Apr 16 20:16:40.672425 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672404 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="storage-initializer"
Apr 16 20:16:40.672549 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672428 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="storage-initializer"
Apr 16 20:16:40.672549 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672451 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container"
Apr 16 20:16:40.672549 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672459 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container"
Apr 16 20:16:40.672549 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672478 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-agent"
Apr 16 20:16:40.672549 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672486 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-agent"
Apr 16 20:16:40.672806 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672564 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-agent"
Apr 16 20:16:40.672806 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.672579 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f5dbe4a-60e7-4c3b-99ad-2dbfd811ae81" containerName="kserve-container"
Apr 16 20:16:40.675709 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.675687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:16:40.687728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.687701 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"]
Apr 16 20:16:40.816726 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.816691 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30322e29-bbd2-41cb-a8c2-146420f13eb1-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-jxngb\" (UID: \"30322e29-bbd2-41cb-a8c2-146420f13eb1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:16:40.917544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.917436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30322e29-bbd2-41cb-a8c2-146420f13eb1-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-jxngb\" (UID: \"30322e29-bbd2-41cb-a8c2-146420f13eb1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:16:40.917949 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.917919 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30322e29-bbd2-41cb-a8c2-146420f13eb1-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-jxngb\" (UID: \"30322e29-bbd2-41cb-a8c2-146420f13eb1\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:16:40.986143 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:40.986100 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:16:41.119427 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:41.119396 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"]
Apr 16 20:16:41.121121 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:16:41.121091 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30322e29_bbd2_41cb_a8c2_146420f13eb1.slice/crio-7c36fa0815b72dbeeaf8a5d66c37895075a833339c7ae8ec65e6a62203eea297 WatchSource:0}: Error finding container 7c36fa0815b72dbeeaf8a5d66c37895075a833339c7ae8ec65e6a62203eea297: Status 404 returned error can't find the container with id 7c36fa0815b72dbeeaf8a5d66c37895075a833339c7ae8ec65e6a62203eea297
Apr 16 20:16:41.168657 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:41.168583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" event={"ID":"30322e29-bbd2-41cb-a8c2-146420f13eb1","Type":"ContainerStarted","Data":"7c36fa0815b72dbeeaf8a5d66c37895075a833339c7ae8ec65e6a62203eea297"}
Apr 16 20:16:42.173197 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:42.173160 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" event={"ID":"30322e29-bbd2-41cb-a8c2-146420f13eb1","Type":"ContainerStarted","Data":"9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e"}
Apr 16 20:16:43.420097 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:43.420050 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:16:43.540006 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:43.539971 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1025b52d-7b29-463d-ade9-fae4353b6a59-kserve-provision-location\") pod \"1025b52d-7b29-463d-ade9-fae4353b6a59\" (UID: \"1025b52d-7b29-463d-ade9-fae4353b6a59\") "
Apr 16 20:16:43.549517 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:43.549487 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1025b52d-7b29-463d-ade9-fae4353b6a59-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1025b52d-7b29-463d-ade9-fae4353b6a59" (UID: "1025b52d-7b29-463d-ade9-fae4353b6a59"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:16:43.641237 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:43.641196 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1025b52d-7b29-463d-ade9-fae4353b6a59-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:16:44.180432 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.180394 2570 generic.go:358] "Generic (PLEG): container finished" podID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerID="cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39" exitCode=0
Apr 16 20:16:44.180632 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.180470 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"
Apr 16 20:16:44.180632 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.180476 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" event={"ID":"1025b52d-7b29-463d-ade9-fae4353b6a59","Type":"ContainerDied","Data":"cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39"}
Apr 16 20:16:44.180632 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.180505 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb" event={"ID":"1025b52d-7b29-463d-ade9-fae4353b6a59","Type":"ContainerDied","Data":"903404493e7b221bbeff600affa292749f385166c1c679b9e3f3bba5a1202903"}
Apr 16 20:16:44.180632 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.180524 2570 scope.go:117] "RemoveContainer" containerID="cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39"
Apr 16 20:16:44.188846 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.188823 2570 scope.go:117] "RemoveContainer" containerID="67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e"
Apr 16 20:16:44.196669 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.196650 2570 scope.go:117] "RemoveContainer" containerID="cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39"
Apr 16 20:16:44.196919 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:16:44.196896 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39\": container with ID starting with cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39 not found: ID does not exist" containerID="cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39"
Apr 16 20:16:44.196968 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.196929 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39"} err="failed to get container status \"cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39\": rpc error: code = NotFound desc = could not find container \"cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39\": container with ID starting with cec5be468c5bb4a048d3695f4fe83e25d9524b211dea84634b6f75e52a6ecf39 not found: ID does not exist"
Apr 16 20:16:44.196968 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.196949 2570 scope.go:117] "RemoveContainer" containerID="67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e"
Apr 16 20:16:44.197185 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:16:44.197166 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e\": container with ID starting with 67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e not found: ID does not exist" containerID="67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e"
Apr 16 20:16:44.197250 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.197194 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e"} err="failed to get container status \"67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e\": rpc error: code = NotFound desc = could not find container \"67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e\": container with ID starting with 67c8bde476d7d9374f344b1d045df541b60c1164fb4bfbadf8712cd19974e12e not found: ID does not exist"
Apr 16 20:16:44.203179 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.203155 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"]
Apr 16 20:16:44.207590 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:44.207566 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-pp8wb"]
Apr 16 20:16:45.111755 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:45.111711 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" path="/var/lib/kubelet/pods/1025b52d-7b29-463d-ade9-fae4353b6a59/volumes"
Apr 16 20:16:46.189536 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:46.189505 2570 generic.go:358] "Generic (PLEG): container finished" podID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerID="9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e" exitCode=0
Apr 16 20:16:46.189942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:46.189576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" event={"ID":"30322e29-bbd2-41cb-a8c2-146420f13eb1","Type":"ContainerDied","Data":"9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e"}
Apr 16 20:16:47.194803 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:47.194765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" event={"ID":"30322e29-bbd2-41cb-a8c2-146420f13eb1","Type":"ContainerStarted","Data":"60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea"}
Apr 16 20:16:47.195215 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:47.195090 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:16:47.196533 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:47.196503 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 16 20:16:48.198973 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:48.198935 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 16 20:16:58.199127 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:16:58.199051 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 16 20:17:08.199673 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:08.199627 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 16 20:17:18.199257 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:18.199207 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 16 20:17:28.199165 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:28.199117 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 16 20:17:38.200249 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:38.200214 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"
Apr 16 20:17:38.217838 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:38.217788 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podStartSLOduration=58.217773421 podStartE2EDuration="58.217773421s" podCreationTimestamp="2026-04-16 20:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:47.243795325 +0000 UTC m=+1370.778661360" watchObservedRunningTime="2026-04-16 20:17:38.217773421 +0000 UTC m=+1421.752639525"
Apr 16 20:17:42.186826 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.186786 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"]
Apr 16 20:17:42.187246 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.187082 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" containerID="cri-o://60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea" gracePeriod=30
Apr 16 20:17:42.271594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.271551 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"]
Apr 16 20:17:42.271996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.271980 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container"
Apr 16 20:17:42.271996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.271997 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container"
Apr 16 20:17:42.272124 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.272015 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="storage-initializer"
Apr 16 20:17:42.272124 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.272021 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="storage-initializer"
Apr 16 20:17:42.272124 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.272090 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1025b52d-7b29-463d-ade9-fae4353b6a59" containerName="kserve-container"
Apr 16 20:17:42.275259 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.275243 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"
Apr 16 20:17:42.282390 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.282353 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"]
Apr 16 20:17:42.343256 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.343215 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02c8119b-7b5d-490c-ab1f-c93f82fae7be-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h\" (UID: \"02c8119b-7b5d-490c-ab1f-c93f82fae7be\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"
Apr 16 20:17:42.444763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.444652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02c8119b-7b5d-490c-ab1f-c93f82fae7be-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h\" (UID: \"02c8119b-7b5d-490c-ab1f-c93f82fae7be\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"
Apr 16 20:17:42.445074 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.445032 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02c8119b-7b5d-490c-ab1f-c93f82fae7be-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h\" (UID: \"02c8119b-7b5d-490c-ab1f-c93f82fae7be\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"
Apr 16 20:17:42.586604 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.586561 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"
Apr 16 20:17:42.713396 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:42.713369 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"]
Apr 16 20:17:42.716288 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:17:42.716252 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c8119b_7b5d_490c_ab1f_c93f82fae7be.slice/crio-c4dc383caea0e393a7e7e2eef44c5053df648816b3eac34509799c5234189a9d WatchSource:0}: Error finding container c4dc383caea0e393a7e7e2eef44c5053df648816b3eac34509799c5234189a9d: Status 404 returned error can't find the container with id c4dc383caea0e393a7e7e2eef44c5053df648816b3eac34509799c5234189a9d
Apr 16 20:17:43.384430 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:43.384385 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"
event={"ID":"02c8119b-7b5d-490c-ab1f-c93f82fae7be","Type":"ContainerStarted","Data":"e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2"} Apr 16 20:17:43.384430 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:43.384436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" event={"ID":"02c8119b-7b5d-490c-ab1f-c93f82fae7be","Type":"ContainerStarted","Data":"c4dc383caea0e393a7e7e2eef44c5053df648816b3eac34509799c5234189a9d"} Apr 16 20:17:45.130866 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.130842 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" Apr 16 20:17:45.168938 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.168901 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30322e29-bbd2-41cb-a8c2-146420f13eb1-kserve-provision-location\") pod \"30322e29-bbd2-41cb-a8c2-146420f13eb1\" (UID: \"30322e29-bbd2-41cb-a8c2-146420f13eb1\") " Apr 16 20:17:45.178742 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.178705 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30322e29-bbd2-41cb-a8c2-146420f13eb1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30322e29-bbd2-41cb-a8c2-146420f13eb1" (UID: "30322e29-bbd2-41cb-a8c2-146420f13eb1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:45.270357 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.270318 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30322e29-bbd2-41cb-a8c2-146420f13eb1-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:17:45.392970 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.392885 2570 generic.go:358] "Generic (PLEG): container finished" podID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerID="60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea" exitCode=0 Apr 16 20:17:45.392970 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.392954 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" Apr 16 20:17:45.393240 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.392972 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" event={"ID":"30322e29-bbd2-41cb-a8c2-146420f13eb1","Type":"ContainerDied","Data":"60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea"} Apr 16 20:17:45.393240 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.393017 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb" event={"ID":"30322e29-bbd2-41cb-a8c2-146420f13eb1","Type":"ContainerDied","Data":"7c36fa0815b72dbeeaf8a5d66c37895075a833339c7ae8ec65e6a62203eea297"} Apr 16 20:17:45.393240 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.393038 2570 scope.go:117] "RemoveContainer" containerID="60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea" Apr 16 20:17:45.401535 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.401518 2570 scope.go:117] "RemoveContainer" 
containerID="9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e" Apr 16 20:17:45.408625 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.408610 2570 scope.go:117] "RemoveContainer" containerID="60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea" Apr 16 20:17:45.408859 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:17:45.408840 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea\": container with ID starting with 60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea not found: ID does not exist" containerID="60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea" Apr 16 20:17:45.408937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.408872 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea"} err="failed to get container status \"60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea\": rpc error: code = NotFound desc = could not find container \"60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea\": container with ID starting with 60657c738b02df1546a8d633c807ec44141400c05379630114a95fc9b28348ea not found: ID does not exist" Apr 16 20:17:45.408937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.408896 2570 scope.go:117] "RemoveContainer" containerID="9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e" Apr 16 20:17:45.409158 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:17:45.409142 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e\": container with ID starting with 9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e not found: ID does not exist" 
containerID="9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e" Apr 16 20:17:45.409209 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.409164 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e"} err="failed to get container status \"9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e\": rpc error: code = NotFound desc = could not find container \"9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e\": container with ID starting with 9a1baca76464d8e96f23dfbd6f4ea741cbf7885e960c93ec4f92d2232be5358e not found: ID does not exist" Apr 16 20:17:45.413973 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.413952 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"] Apr 16 20:17:45.417667 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:45.417648 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-jxngb"] Apr 16 20:17:47.117763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:47.117724 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" path="/var/lib/kubelet/pods/30322e29-bbd2-41cb-a8c2-146420f13eb1/volumes" Apr 16 20:17:47.402155 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:47.402037 2570 generic.go:358] "Generic (PLEG): container finished" podID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerID="e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2" exitCode=0 Apr 16 20:17:47.402155 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:47.402119 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" 
event={"ID":"02c8119b-7b5d-490c-ab1f-c93f82fae7be","Type":"ContainerDied","Data":"e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2"} Apr 16 20:17:48.407475 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:48.407439 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" event={"ID":"02c8119b-7b5d-490c-ab1f-c93f82fae7be","Type":"ContainerStarted","Data":"81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb"} Apr 16 20:17:48.407881 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:48.407740 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" Apr 16 20:17:48.409177 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:48.409150 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:17:48.425481 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:48.425434 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podStartSLOduration=6.425419506 podStartE2EDuration="6.425419506s" podCreationTimestamp="2026-04-16 20:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:17:48.423232701 +0000 UTC m=+1431.958098749" watchObservedRunningTime="2026-04-16 20:17:48.425419506 +0000 UTC m=+1431.960285542" Apr 16 20:17:49.411210 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:49.411167 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:17:59.411764 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:17:59.411717 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:18:09.411492 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:09.411441 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:18:19.411416 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:19.411370 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:18:29.411558 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:29.411512 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:18:39.412328 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:39.412282 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" Apr 16 20:18:43.985234 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:43.985198 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"] Apr 16 20:18:43.985664 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:43.985545 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container" containerID="cri-o://81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb" gracePeriod=30 Apr 16 20:18:44.095128 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.095091 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"] Apr 16 20:18:44.095545 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.095531 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="storage-initializer" Apr 16 20:18:44.095616 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.095548 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="storage-initializer" Apr 16 20:18:44.095616 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.095566 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" Apr 16 20:18:44.095616 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.095572 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" Apr 16 20:18:44.095717 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.095636 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="30322e29-bbd2-41cb-a8c2-146420f13eb1" containerName="kserve-container" Apr 16 20:18:44.098950 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.098929 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" Apr 16 20:18:44.108106 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.108078 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"] Apr 16 20:18:44.205735 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.205698 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ec204dd-48b0-43b5-82ce-73f3f7a70718-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-vhkm4\" (UID: \"0ec204dd-48b0-43b5-82ce-73f3f7a70718\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" Apr 16 20:18:44.307028 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.306988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ec204dd-48b0-43b5-82ce-73f3f7a70718-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-vhkm4\" (UID: \"0ec204dd-48b0-43b5-82ce-73f3f7a70718\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" Apr 16 20:18:44.307386 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.307363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ec204dd-48b0-43b5-82ce-73f3f7a70718-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-vhkm4\" (UID: \"0ec204dd-48b0-43b5-82ce-73f3f7a70718\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" Apr 16 20:18:44.410999 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.410970 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" Apr 16 20:18:44.539612 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.539577 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"] Apr 16 20:18:44.542410 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:18:44.542377 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ec204dd_48b0_43b5_82ce_73f3f7a70718.slice/crio-1b859aecb61ac7ae64435b293b8db98051976d0f335d5e805de9d85c348c75fb WatchSource:0}: Error finding container 1b859aecb61ac7ae64435b293b8db98051976d0f335d5e805de9d85c348c75fb: Status 404 returned error can't find the container with id 1b859aecb61ac7ae64435b293b8db98051976d0f335d5e805de9d85c348c75fb Apr 16 20:18:44.544462 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.544444 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:18:44.608872 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.608829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" event={"ID":"0ec204dd-48b0-43b5-82ce-73f3f7a70718","Type":"ContainerStarted","Data":"3b5edf9c0e80e9e524ad3edcdd99351ae9446ee9b74797c324338c819e66e700"} Apr 16 20:18:44.608872 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:44.608867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" event={"ID":"0ec204dd-48b0-43b5-82ce-73f3f7a70718","Type":"ContainerStarted","Data":"1b859aecb61ac7ae64435b293b8db98051976d0f335d5e805de9d85c348c75fb"} Apr 16 20:18:46.934438 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:46.934413 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" Apr 16 20:18:47.029887 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.029858 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02c8119b-7b5d-490c-ab1f-c93f82fae7be-kserve-provision-location\") pod \"02c8119b-7b5d-490c-ab1f-c93f82fae7be\" (UID: \"02c8119b-7b5d-490c-ab1f-c93f82fae7be\") " Apr 16 20:18:47.039538 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.039494 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c8119b-7b5d-490c-ab1f-c93f82fae7be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "02c8119b-7b5d-490c-ab1f-c93f82fae7be" (UID: "02c8119b-7b5d-490c-ab1f-c93f82fae7be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:47.131205 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.131161 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02c8119b-7b5d-490c-ab1f-c93f82fae7be-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:18:47.620324 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.620287 2570 generic.go:358] "Generic (PLEG): container finished" podID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerID="81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb" exitCode=0 Apr 16 20:18:47.620498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.620360 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" Apr 16 20:18:47.620498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.620372 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" event={"ID":"02c8119b-7b5d-490c-ab1f-c93f82fae7be","Type":"ContainerDied","Data":"81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb"} Apr 16 20:18:47.620498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.620423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h" event={"ID":"02c8119b-7b5d-490c-ab1f-c93f82fae7be","Type":"ContainerDied","Data":"c4dc383caea0e393a7e7e2eef44c5053df648816b3eac34509799c5234189a9d"} Apr 16 20:18:47.620498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.620444 2570 scope.go:117] "RemoveContainer" containerID="81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb" Apr 16 20:18:47.628941 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.628915 2570 scope.go:117] "RemoveContainer" containerID="e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2" Apr 16 20:18:47.636894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.636874 2570 scope.go:117] "RemoveContainer" containerID="81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb" Apr 16 20:18:47.637160 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:18:47.637143 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb\": container with ID starting with 81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb not found: ID does not exist" containerID="81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb" Apr 16 20:18:47.637254 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.637167 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb"} err="failed to get container status \"81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb\": rpc error: code = NotFound desc = could not find container \"81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb\": container with ID starting with 81717732039efb7eabf786aab349aa1bd35e7f1d86086e1ec58e04dc475891bb not found: ID does not exist" Apr 16 20:18:47.637254 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.637184 2570 scope.go:117] "RemoveContainer" containerID="e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2" Apr 16 20:18:47.637410 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:18:47.637394 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2\": container with ID starting with e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2 not found: ID does not exist" containerID="e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2" Apr 16 20:18:47.637469 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.637412 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2"} err="failed to get container status \"e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2\": rpc error: code = NotFound desc = could not find container \"e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2\": container with ID starting with e1656abbec963396e1035ed02693e4093ad021f9a950403c883cb2b11bb8e1f2 not found: ID does not exist" Apr 16 20:18:47.638679 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.638653 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"] Apr 16 20:18:47.642275 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:47.642245 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-cgl9h"] Apr 16 20:18:48.626324 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:48.626288 2570 generic.go:358] "Generic (PLEG): container finished" podID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerID="3b5edf9c0e80e9e524ad3edcdd99351ae9446ee9b74797c324338c819e66e700" exitCode=0 Apr 16 20:18:48.626802 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:48.626365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" event={"ID":"0ec204dd-48b0-43b5-82ce-73f3f7a70718","Type":"ContainerDied","Data":"3b5edf9c0e80e9e524ad3edcdd99351ae9446ee9b74797c324338c819e66e700"} Apr 16 20:18:49.111552 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:49.111470 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" path="/var/lib/kubelet/pods/02c8119b-7b5d-490c-ab1f-c93f82fae7be/volumes" Apr 16 20:18:56.660631 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:56.660592 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" event={"ID":"0ec204dd-48b0-43b5-82ce-73f3f7a70718","Type":"ContainerStarted","Data":"2324df0052f8416b1e624c0b626ef5258a74bcce5b811c76cbd42a5baf479ee1"} Apr 16 20:18:56.661028 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:56.660911 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" Apr 16 20:18:56.662224 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:56.662197 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:18:56.677250 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:56.677204 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podStartSLOduration=5.566136287 podStartE2EDuration="12.677188634s" podCreationTimestamp="2026-04-16 20:18:44 +0000 UTC" firstStartedPulling="2026-04-16 20:18:48.627650604 +0000 UTC m=+1492.162516618" lastFinishedPulling="2026-04-16 20:18:55.738702948 +0000 UTC m=+1499.273568965" observedRunningTime="2026-04-16 20:18:56.675794327 +0000 UTC m=+1500.210660363" watchObservedRunningTime="2026-04-16 20:18:56.677188634 +0000 UTC m=+1500.212054674"
Apr 16 20:18:57.664771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:18:57.664728 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:19:07.665227 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:19:07.665174 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:19:17.665222 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:19:17.665131 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:19:27.664923 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:19:27.664877 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:19:37.665836 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:19:37.665790 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:19:47.664803 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:19:47.664759 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:19:57.664805 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:19:57.664750 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:20:07.664846 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:07.664797 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:20:17.665979 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:17.665948 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"
Apr 16 20:20:25.293251 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.293207 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"]
Apr 16 20:20:25.293666 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.293594 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" containerID="cri-o://2324df0052f8416b1e624c0b626ef5258a74bcce5b811c76cbd42a5baf479ee1" gracePeriod=30
Apr 16 20:20:25.370174 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.370134 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"]
Apr 16 20:20:25.370527 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.370514 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="storage-initializer"
Apr 16 20:20:25.370577 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.370528 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="storage-initializer"
Apr 16 20:20:25.370577 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.370553 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container"
Apr 16 20:20:25.370577 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.370560 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container"
Apr 16 20:20:25.370670 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.370614 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="02c8119b-7b5d-490c-ab1f-c93f82fae7be" containerName="kserve-container"
Apr 16 20:20:25.373712 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.373695 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:20:25.381037 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.381008 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"]
Apr 16 20:20:25.498161 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.498121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d79b2eba-0947-4200-9757-ea75967c6f42-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-m7zfd\" (UID: \"d79b2eba-0947-4200-9757-ea75967c6f42\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:20:25.599156 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.599069 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d79b2eba-0947-4200-9757-ea75967c6f42-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-m7zfd\" (UID: \"d79b2eba-0947-4200-9757-ea75967c6f42\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:20:25.599444 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.599422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d79b2eba-0947-4200-9757-ea75967c6f42-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-m7zfd\" (UID: \"d79b2eba-0947-4200-9757-ea75967c6f42\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:20:25.685070 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.685009 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:20:25.817503 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.817394 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"]
Apr 16 20:20:25.820132 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:20:25.820101 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd79b2eba_0947_4200_9757_ea75967c6f42.slice/crio-20700279239ee337b78ad0f3699097901ccbcd63b02d85c9088c00ae58d6de75 WatchSource:0}: Error finding container 20700279239ee337b78ad0f3699097901ccbcd63b02d85c9088c00ae58d6de75: Status 404 returned error can't find the container with id 20700279239ee337b78ad0f3699097901ccbcd63b02d85c9088c00ae58d6de75
Apr 16 20:20:25.958151 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.958113 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" event={"ID":"d79b2eba-0947-4200-9757-ea75967c6f42","Type":"ContainerStarted","Data":"1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887"}
Apr 16 20:20:25.958151 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:25.958153 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" event={"ID":"d79b2eba-0947-4200-9757-ea75967c6f42","Type":"ContainerStarted","Data":"20700279239ee337b78ad0f3699097901ccbcd63b02d85c9088c00ae58d6de75"}
Apr 16 20:20:27.664970 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:27.664923 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 20:20:28.970777 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:28.970742 2570 generic.go:358] "Generic (PLEG): container finished" podID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerID="2324df0052f8416b1e624c0b626ef5258a74bcce5b811c76cbd42a5baf479ee1" exitCode=0
Apr 16 20:20:28.971243 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:28.970836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" event={"ID":"0ec204dd-48b0-43b5-82ce-73f3f7a70718","Type":"ContainerDied","Data":"2324df0052f8416b1e624c0b626ef5258a74bcce5b811c76cbd42a5baf479ee1"}
Apr 16 20:20:29.039596 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.039569 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"
Apr 16 20:20:29.129427 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.129346 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ec204dd-48b0-43b5-82ce-73f3f7a70718-kserve-provision-location\") pod \"0ec204dd-48b0-43b5-82ce-73f3f7a70718\" (UID: \"0ec204dd-48b0-43b5-82ce-73f3f7a70718\") "
Apr 16 20:20:29.129666 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.129642 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec204dd-48b0-43b5-82ce-73f3f7a70718-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ec204dd-48b0-43b5-82ce-73f3f7a70718" (UID: "0ec204dd-48b0-43b5-82ce-73f3f7a70718"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:20:29.230879 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.230840 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ec204dd-48b0-43b5-82ce-73f3f7a70718-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:20:29.975269 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.975170 2570 generic.go:358] "Generic (PLEG): container finished" podID="d79b2eba-0947-4200-9757-ea75967c6f42" containerID="1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887" exitCode=0
Apr 16 20:20:29.975269 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.975250 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" event={"ID":"d79b2eba-0947-4200-9757-ea75967c6f42","Type":"ContainerDied","Data":"1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887"}
Apr 16 20:20:29.977376 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.977354 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4" event={"ID":"0ec204dd-48b0-43b5-82ce-73f3f7a70718","Type":"ContainerDied","Data":"1b859aecb61ac7ae64435b293b8db98051976d0f335d5e805de9d85c348c75fb"}
Apr 16 20:20:29.977475 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.977392 2570 scope.go:117] "RemoveContainer" containerID="2324df0052f8416b1e624c0b626ef5258a74bcce5b811c76cbd42a5baf479ee1"
Apr 16 20:20:29.977475 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.977407 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"
Apr 16 20:20:29.985795 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:29.985776 2570 scope.go:117] "RemoveContainer" containerID="3b5edf9c0e80e9e524ad3edcdd99351ae9446ee9b74797c324338c819e66e700"
Apr 16 20:20:30.006152 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:30.006126 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"]
Apr 16 20:20:30.009743 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:30.009721 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-vhkm4"]
Apr 16 20:20:30.983753 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:30.983718 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" event={"ID":"d79b2eba-0947-4200-9757-ea75967c6f42","Type":"ContainerStarted","Data":"21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5"}
Apr 16 20:20:30.984288 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:30.984116 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:20:30.985462 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:30.985434 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:20:31.000676 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:31.000627 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podStartSLOduration=6.000611915 podStartE2EDuration="6.000611915s" podCreationTimestamp="2026-04-16 20:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:30.999151996 +0000 UTC m=+1594.534018033" watchObservedRunningTime="2026-04-16 20:20:31.000611915 +0000 UTC m=+1594.535477952"
Apr 16 20:20:31.111318 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:31.111280 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" path="/var/lib/kubelet/pods/0ec204dd-48b0-43b5-82ce-73f3f7a70718/volumes"
Apr 16 20:20:31.987449 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:31.987410 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:20:41.987878 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:41.987833 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:20:51.987512 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:20:51.987417 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:01.988330 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:01.988285 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:11.987584 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:11.987538 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:21.988075 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:21.988002 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:31.988277 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:31.988236 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:33.108042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:33.107998 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:43.108362 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:43.108320 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 16 20:21:53.112037 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:53.112010 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:21:56.388066 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.388011 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"]
Apr 16 20:21:56.388464 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.388414 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container" containerID="cri-o://21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5" gracePeriod=30
Apr 16 20:21:56.475286 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.475248 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"]
Apr 16 20:21:56.475681 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.475654 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container"
Apr 16 20:21:56.475681 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.475672 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container"
Apr 16 20:21:56.475918 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.475688 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="storage-initializer"
Apr 16 20:21:56.475918 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.475696 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="storage-initializer"
Apr 16 20:21:56.475918 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.475785 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ec204dd-48b0-43b5-82ce-73f3f7a70718" containerName="kserve-container"
Apr 16 20:21:56.479168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.479147 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:21:56.486449 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.486422 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"]
Apr 16 20:21:56.616371 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.616327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96dee0e8-19a6-4ebf-9ff3-e2b70d132448-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2\" (UID: \"96dee0e8-19a6-4ebf-9ff3-e2b70d132448\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:21:56.717610 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.717501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96dee0e8-19a6-4ebf-9ff3-e2b70d132448-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2\" (UID: \"96dee0e8-19a6-4ebf-9ff3-e2b70d132448\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:21:56.717895 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.717874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96dee0e8-19a6-4ebf-9ff3-e2b70d132448-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2\" (UID: \"96dee0e8-19a6-4ebf-9ff3-e2b70d132448\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:21:56.791736 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.791696 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:21:56.912989 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:56.912958 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"]
Apr 16 20:21:56.915347 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:21:56.915307 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96dee0e8_19a6_4ebf_9ff3_e2b70d132448.slice/crio-c296ea5c7e09901228f8aa9ee4a4762447b338723c581f8a9adba52dcb7e5eba WatchSource:0}: Error finding container c296ea5c7e09901228f8aa9ee4a4762447b338723c581f8a9adba52dcb7e5eba: Status 404 returned error can't find the container with id c296ea5c7e09901228f8aa9ee4a4762447b338723c581f8a9adba52dcb7e5eba
Apr 16 20:21:57.280843 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:57.280805 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" event={"ID":"96dee0e8-19a6-4ebf-9ff3-e2b70d132448","Type":"ContainerStarted","Data":"66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf"}
Apr 16 20:21:57.281011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:21:57.280851 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" event={"ID":"96dee0e8-19a6-4ebf-9ff3-e2b70d132448","Type":"ContainerStarted","Data":"c296ea5c7e09901228f8aa9ee4a4762447b338723c581f8a9adba52dcb7e5eba"}
Apr 16 20:22:00.134523 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.134501 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:22:00.250992 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.250953 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d79b2eba-0947-4200-9757-ea75967c6f42-kserve-provision-location\") pod \"d79b2eba-0947-4200-9757-ea75967c6f42\" (UID: \"d79b2eba-0947-4200-9757-ea75967c6f42\") "
Apr 16 20:22:00.251262 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.251239 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79b2eba-0947-4200-9757-ea75967c6f42-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d79b2eba-0947-4200-9757-ea75967c6f42" (UID: "d79b2eba-0947-4200-9757-ea75967c6f42"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:22:00.293829 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.293797 2570 generic.go:358] "Generic (PLEG): container finished" podID="d79b2eba-0947-4200-9757-ea75967c6f42" containerID="21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5" exitCode=0
Apr 16 20:22:00.294011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.293871 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"
Apr 16 20:22:00.294011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.293865 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" event={"ID":"d79b2eba-0947-4200-9757-ea75967c6f42","Type":"ContainerDied","Data":"21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5"}
Apr 16 20:22:00.294011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.293912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd" event={"ID":"d79b2eba-0947-4200-9757-ea75967c6f42","Type":"ContainerDied","Data":"20700279239ee337b78ad0f3699097901ccbcd63b02d85c9088c00ae58d6de75"}
Apr 16 20:22:00.294011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.293928 2570 scope.go:117] "RemoveContainer" containerID="21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5"
Apr 16 20:22:00.303117 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.303099 2570 scope.go:117] "RemoveContainer" containerID="1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887"
Apr 16 20:22:00.310594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.310572 2570 scope.go:117] "RemoveContainer" containerID="21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5"
Apr 16 20:22:00.310864 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:22:00.310846 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5\": container with ID starting with 21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5 not found: ID does not exist" containerID="21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5"
Apr 16 20:22:00.310906 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.310870 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5"} err="failed to get container status \"21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5\": rpc error: code = NotFound desc = could not find container \"21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5\": container with ID starting with 21b8d1c8a8f9ef57e0be597dbf022648bf0e0cc2f6eafd3c448eaac4f4aabac5 not found: ID does not exist"
Apr 16 20:22:00.310906 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.310887 2570 scope.go:117] "RemoveContainer" containerID="1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887"
Apr 16 20:22:00.311161 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:22:00.311141 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887\": container with ID starting with 1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887 not found: ID does not exist" containerID="1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887"
Apr 16 20:22:00.311207 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.311169 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887"} err="failed to get container status \"1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887\": rpc error: code = NotFound desc = could not find container \"1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887\": container with ID starting with 1498242b0090de34cc939ef03049977c1074562d7f661ce6d50cb8a8d066e887 not found: ID does not exist"
Apr 16 20:22:00.319014 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.318990 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"]
Apr 16 20:22:00.325307 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.325285 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-m7zfd"]
Apr 16 20:22:00.351933 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:00.351887 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d79b2eba-0947-4200-9757-ea75967c6f42-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:22:01.111858 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:01.111818 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" path="/var/lib/kubelet/pods/d79b2eba-0947-4200-9757-ea75967c6f42/volumes"
Apr 16 20:22:01.300216 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:01.300183 2570 generic.go:358] "Generic (PLEG): container finished" podID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerID="66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf" exitCode=0
Apr 16 20:22:01.300618 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:01.300255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" event={"ID":"96dee0e8-19a6-4ebf-9ff3-e2b70d132448","Type":"ContainerDied","Data":"66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf"}
Apr 16 20:22:02.305445 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:02.305411 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" event={"ID":"96dee0e8-19a6-4ebf-9ff3-e2b70d132448","Type":"ContainerStarted","Data":"ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879"}
Apr 16 20:22:02.305947 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:02.305755 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:22:02.307103 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:02.307051 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:22:02.327399 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:02.327347 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podStartSLOduration=6.327331888 podStartE2EDuration="6.327331888s" podCreationTimestamp="2026-04-16 20:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:22:02.324789445 +0000 UTC m=+1685.859655481" watchObservedRunningTime="2026-04-16 20:22:02.327331888 +0000 UTC m=+1685.862197923"
Apr 16 20:22:03.309576 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:03.309530 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:22:13.309923 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:13.309872 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:22:23.310254 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:23.310145 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:22:33.310396 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:33.310344 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:22:43.309947 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:43.309896 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:22:53.309624 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:22:53.309569 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:23:03.309896 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:03.309842 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:23:09.107396 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:09.107351 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:23:19.107419 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:19.107363 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 20:23:29.112064 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:29.112024 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"
Apr 16 20:23:37.521146 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.521107 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"]
Apr 16 20:23:37.521581 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.521409 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" containerID="cri-o://ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879" gracePeriod=30
Apr 16 20:23:37.596083 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.596022 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"]
Apr 16 20:23:37.596520 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.596500 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container"
Apr 16 20:23:37.596520 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.596522 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container"
Apr 16 20:23:37.596685 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.596554 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="storage-initializer"
Apr 16 20:23:37.596685 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.596562 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="storage-initializer"
Apr 16 20:23:37.596685 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.596657 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d79b2eba-0947-4200-9757-ea75967c6f42" containerName="kserve-container"
Apr 16 20:23:37.599831 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.599810 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"
Apr 16 20:23:37.607194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.607025 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"]
Apr 16 20:23:37.694924 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.694882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c-kserve-provision-location\") pod \"isvc-primary-db5133-predictor-5d499d5b4f-2qqlh\" (UID: \"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c\") " pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"
Apr 16 20:23:37.796119 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.795990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c-kserve-provision-location\") pod \"isvc-primary-db5133-predictor-5d499d5b4f-2qqlh\" (UID:
\"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c\") " pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:23:37.796418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.796395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c-kserve-provision-location\") pod \"isvc-primary-db5133-predictor-5d499d5b4f-2qqlh\" (UID: \"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c\") " pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:23:37.911453 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:37.911417 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:23:38.043817 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:38.043729 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"] Apr 16 20:23:38.046309 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:23:38.046246 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0df88b_d63c_4cd4_9dbc_dc3892bb180c.slice/crio-aecb734c28b5939ec7bdfeee902782d7387a57b15961c30676176fa87efb3216 WatchSource:0}: Error finding container aecb734c28b5939ec7bdfeee902782d7387a57b15961c30676176fa87efb3216: Status 404 returned error can't find the container with id aecb734c28b5939ec7bdfeee902782d7387a57b15961c30676176fa87efb3216 Apr 16 20:23:38.646884 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:38.646846 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" event={"ID":"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c","Type":"ContainerStarted","Data":"255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1"} Apr 16 20:23:38.646884 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 20:23:38.646884 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" event={"ID":"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c","Type":"ContainerStarted","Data":"aecb734c28b5939ec7bdfeee902782d7387a57b15961c30676176fa87efb3216"} Apr 16 20:23:39.107416 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:39.107366 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 20:23:41.273783 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.273754 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" Apr 16 20:23:41.429321 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.429242 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96dee0e8-19a6-4ebf-9ff3-e2b70d132448-kserve-provision-location\") pod \"96dee0e8-19a6-4ebf-9ff3-e2b70d132448\" (UID: \"96dee0e8-19a6-4ebf-9ff3-e2b70d132448\") " Apr 16 20:23:41.429632 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.429607 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96dee0e8-19a6-4ebf-9ff3-e2b70d132448-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96dee0e8-19a6-4ebf-9ff3-e2b70d132448" (UID: "96dee0e8-19a6-4ebf-9ff3-e2b70d132448"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:41.530277 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.530244 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96dee0e8-19a6-4ebf-9ff3-e2b70d132448-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:23:41.659253 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.659220 2570 generic.go:358] "Generic (PLEG): container finished" podID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerID="ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879" exitCode=0 Apr 16 20:23:41.659434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.659276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" event={"ID":"96dee0e8-19a6-4ebf-9ff3-e2b70d132448","Type":"ContainerDied","Data":"ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879"} Apr 16 20:23:41.659434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.659292 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" Apr 16 20:23:41.659434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.659302 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2" event={"ID":"96dee0e8-19a6-4ebf-9ff3-e2b70d132448","Type":"ContainerDied","Data":"c296ea5c7e09901228f8aa9ee4a4762447b338723c581f8a9adba52dcb7e5eba"} Apr 16 20:23:41.659434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.659318 2570 scope.go:117] "RemoveContainer" containerID="ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879" Apr 16 20:23:41.668290 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.668253 2570 scope.go:117] "RemoveContainer" containerID="66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf" Apr 16 20:23:41.676359 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.676334 2570 scope.go:117] "RemoveContainer" containerID="ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879" Apr 16 20:23:41.676622 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:23:41.676601 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879\": container with ID starting with ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879 not found: ID does not exist" containerID="ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879" Apr 16 20:23:41.676674 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.676631 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879"} err="failed to get container status \"ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879\": rpc error: code = NotFound desc = could not find container 
\"ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879\": container with ID starting with ad84b6068c40038670a9a367434b019ce3f8b6b2b86d779a9294b29e40dd3879 not found: ID does not exist" Apr 16 20:23:41.676674 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.676648 2570 scope.go:117] "RemoveContainer" containerID="66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf" Apr 16 20:23:41.676881 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:23:41.676862 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf\": container with ID starting with 66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf not found: ID does not exist" containerID="66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf" Apr 16 20:23:41.676932 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.676890 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf"} err="failed to get container status \"66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf\": rpc error: code = NotFound desc = could not find container \"66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf\": container with ID starting with 66f0c9d795adf0d42f15b9ee5646d13dd202c42dc5da2f0467c20a91ea5839cf not found: ID does not exist" Apr 16 20:23:41.680982 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.680927 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"] Apr 16 20:23:41.684660 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:41.684639 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-r42g2"] Apr 16 20:23:42.666682 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:42.666649 2570 
generic.go:358] "Generic (PLEG): container finished" podID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerID="255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1" exitCode=0 Apr 16 20:23:42.667142 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:42.666721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" event={"ID":"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c","Type":"ContainerDied","Data":"255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1"} Apr 16 20:23:43.111897 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:43.111858 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" path="/var/lib/kubelet/pods/96dee0e8-19a6-4ebf-9ff3-e2b70d132448/volumes" Apr 16 20:23:43.672142 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:43.672110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" event={"ID":"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c","Type":"ContainerStarted","Data":"470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd"} Apr 16 20:23:43.672581 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:43.672392 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:23:43.673752 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:43.673725 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:23:43.689446 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:43.689403 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podStartSLOduration=6.689388371 podStartE2EDuration="6.689388371s" podCreationTimestamp="2026-04-16 20:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:43.687885389 +0000 UTC m=+1787.222751465" watchObservedRunningTime="2026-04-16 20:23:43.689388371 +0000 UTC m=+1787.224254403" Apr 16 20:23:44.676071 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:44.676027 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:23:54.676365 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:23:54.676315 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:04.676346 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:04.676301 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:14.676165 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:14.676118 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:24.676430 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:24:24.676386 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:34.676488 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:34.676431 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:44.676287 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:44.676235 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:47.109331 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:47.109291 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:24:57.111939 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.111891 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:24:57.730105 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.730050 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4"] Apr 16 20:24:57.730494 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.730480 2570 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="storage-initializer" Apr 16 20:24:57.730542 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.730495 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="storage-initializer" Apr 16 20:24:57.730542 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.730506 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" Apr 16 20:24:57.730542 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.730512 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" Apr 16 20:24:57.730645 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.730573 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="96dee0e8-19a6-4ebf-9ff3-e2b70d132448" containerName="kserve-container" Apr 16 20:24:57.733766 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.733749 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:57.736272 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.736234 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-db5133\"" Apr 16 20:24:57.736415 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.736356 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 20:24:57.737407 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.737391 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-db5133-dockercfg-jkt4n\"" Apr 16 20:24:57.741465 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.741200 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4"] Apr 16 20:24:57.778978 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.778932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/54c960e7-7676-48ae-bceb-56b590f7b19e-cabundle-cert\") pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:57.779190 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.779001 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c960e7-7676-48ae-bceb-56b590f7b19e-kserve-provision-location\") pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:57.880249 ip-10-0-129-34 kubenswrapper[2570]: I0416 
20:24:57.880206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/54c960e7-7676-48ae-bceb-56b590f7b19e-cabundle-cert\") pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:57.880434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.880265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c960e7-7676-48ae-bceb-56b590f7b19e-kserve-provision-location\") pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:57.880663 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.880642 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c960e7-7676-48ae-bceb-56b590f7b19e-kserve-provision-location\") pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:57.880819 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:57.880803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/54c960e7-7676-48ae-bceb-56b590f7b19e-cabundle-cert\") pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:58.045091 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:58.045037 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:24:58.169993 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:58.169963 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4"] Apr 16 20:24:58.172786 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:24:58.172754 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c960e7_7676_48ae_bceb_56b590f7b19e.slice/crio-71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936 WatchSource:0}: Error finding container 71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936: Status 404 returned error can't find the container with id 71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936 Apr 16 20:24:58.174628 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:58.174606 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:24:58.931624 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:58.931587 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" event={"ID":"54c960e7-7676-48ae-bceb-56b590f7b19e","Type":"ContainerStarted","Data":"ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747"} Apr 16 20:24:58.931624 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:24:58.931628 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" event={"ID":"54c960e7-7676-48ae-bceb-56b590f7b19e","Type":"ContainerStarted","Data":"71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936"} Apr 16 20:25:04.955511 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:04.955430 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/0.log" Apr 16 20:25:04.955511 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:04.955468 2570 generic.go:358] "Generic (PLEG): container finished" podID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerID="ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747" exitCode=1 Apr 16 20:25:04.955941 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:04.955548 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" event={"ID":"54c960e7-7676-48ae-bceb-56b590f7b19e","Type":"ContainerDied","Data":"ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747"} Apr 16 20:25:05.960427 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:05.960395 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/0.log" Apr 16 20:25:05.960828 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:05.960493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" event={"ID":"54c960e7-7676-48ae-bceb-56b590f7b19e","Type":"ContainerStarted","Data":"641d114ec542a2f1ce921fcd5475a5cd53c36968b1699edab3ce3aee0a8ad08a"} Apr 16 20:25:10.979488 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:10.979403 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/1.log" Apr 16 20:25:10.979933 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:10.979793 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/0.log" Apr 16 20:25:10.979933 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:10.979827 2570 generic.go:358] "Generic (PLEG): container finished" podID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerID="641d114ec542a2f1ce921fcd5475a5cd53c36968b1699edab3ce3aee0a8ad08a" exitCode=1 Apr 16 20:25:10.979933 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:10.979904 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" event={"ID":"54c960e7-7676-48ae-bceb-56b590f7b19e","Type":"ContainerDied","Data":"641d114ec542a2f1ce921fcd5475a5cd53c36968b1699edab3ce3aee0a8ad08a"} Apr 16 20:25:10.980043 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:10.979957 2570 scope.go:117] "RemoveContainer" containerID="ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747" Apr 16 20:25:10.980334 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:10.980321 2570 scope.go:117] "RemoveContainer" containerID="ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747" Apr 16 20:25:10.990610 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:10.990576 2570 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_kserve-ci-e2e-test_54c960e7-7676-48ae-bceb-56b590f7b19e_0 in pod sandbox 71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936 from index: no such id: 'ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747'" containerID="ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747" Apr 16 20:25:10.990683 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:10.990632 2570 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_kserve-ci-e2e-test_54c960e7-7676-48ae-bceb-56b590f7b19e_0 in pod sandbox 
71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936 from index: no such id: 'ccc70df8a697be702c5b32c54ebe459141ba4c84cf4409f7eec5497d13f4e747'; Skipping pod \"isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_kserve-ci-e2e-test(54c960e7-7676-48ae-bceb-56b590f7b19e)\"" logger="UnhandledError" Apr 16 20:25:10.991951 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:10.991931 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_kserve-ci-e2e-test(54c960e7-7676-48ae-bceb-56b590f7b19e)\"" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" Apr 16 20:25:11.985186 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:11.985158 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/1.log" Apr 16 20:25:15.821893 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.821806 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4"] Apr 16 20:25:15.868817 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.868784 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"] Apr 16 20:25:15.869132 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.869051 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" containerID="cri-o://470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd" gracePeriod=30 Apr 16 20:25:15.922065 ip-10-0-129-34 kubenswrapper[2570]: I0416 
20:25:15.922017 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx"] Apr 16 20:25:15.926893 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.926869 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:15.930357 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.930330 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-8bbbb8-dockercfg-8fc42\"" Apr 16 20:25:15.930508 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.930326 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-8bbbb8\"" Apr 16 20:25:15.934200 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.934178 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx"] Apr 16 20:25:15.970539 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.970514 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/1.log" Apr 16 20:25:15.970724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.970586 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:25:15.999257 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.999226 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-db5133-predictor-5d7bd9896-zxpq4_54c960e7-7676-48ae-bceb-56b590f7b19e/storage-initializer/1.log" Apr 16 20:25:15.999434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.999338 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" Apr 16 20:25:15.999434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.999375 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4" event={"ID":"54c960e7-7676-48ae-bceb-56b590f7b19e","Type":"ContainerDied","Data":"71c4d2e69a39f60b9159603927b3a1840aa89cf60ff80dd3164664a414d1d936"} Apr 16 20:25:15.999434 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:15.999428 2570 scope.go:117] "RemoveContainer" containerID="641d114ec542a2f1ce921fcd5475a5cd53c36968b1699edab3ce3aee0a8ad08a" Apr 16 20:25:16.035574 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.035533 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/54c960e7-7676-48ae-bceb-56b590f7b19e-cabundle-cert\") pod \"54c960e7-7676-48ae-bceb-56b590f7b19e\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " Apr 16 20:25:16.035718 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.035627 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c960e7-7676-48ae-bceb-56b590f7b19e-kserve-provision-location\") pod \"54c960e7-7676-48ae-bceb-56b590f7b19e\" (UID: \"54c960e7-7676-48ae-bceb-56b590f7b19e\") " Apr 16 20:25:16.035912 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.035869 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c960e7-7676-48ae-bceb-56b590f7b19e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54c960e7-7676-48ae-bceb-56b590f7b19e" (UID: "54c960e7-7676-48ae-bceb-56b590f7b19e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:16.035912 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.035881 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c960e7-7676-48ae-bceb-56b590f7b19e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "54c960e7-7676-48ae-bceb-56b590f7b19e" (UID: "54c960e7-7676-48ae-bceb-56b590f7b19e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:25:16.035912 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.035905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-kserve-provision-location\") pod \"isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:16.036173 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.035947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-cabundle-cert\") pod \"isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:16.036173 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.036095 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/54c960e7-7676-48ae-bceb-56b590f7b19e-cabundle-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:25:16.036173 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.036119 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/54c960e7-7676-48ae-bceb-56b590f7b19e-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:25:16.137408 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.137311 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-kserve-provision-location\") pod \"isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:16.137408 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.137354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-cabundle-cert\") pod \"isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:16.137720 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.137695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-kserve-provision-location\") pod \"isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:16.138003 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.137985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-cabundle-cert\") pod \"isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 
20:25:16.238800 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.238770 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:16.342289 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.342111 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4"] Apr 16 20:25:16.344012 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.343986 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-db5133-predictor-5d7bd9896-zxpq4"] Apr 16 20:25:16.371663 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:16.371636 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx"] Apr 16 20:25:16.374390 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:25:16.374356 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbed6218e_4fa8_4d4b_a64a_39cc33b6fff9.slice/crio-567580de7f2703f27586655add0bde235443b440256e6ce7ad5bc2a00286f2de WatchSource:0}: Error finding container 567580de7f2703f27586655add0bde235443b440256e6ce7ad5bc2a00286f2de: Status 404 returned error can't find the container with id 567580de7f2703f27586655add0bde235443b440256e6ce7ad5bc2a00286f2de Apr 16 20:25:17.004235 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:17.004189 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" event={"ID":"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9","Type":"ContainerStarted","Data":"c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf"} Apr 16 20:25:17.004693 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:17.004242 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" 
event={"ID":"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9","Type":"ContainerStarted","Data":"567580de7f2703f27586655add0bde235443b440256e6ce7ad5bc2a00286f2de"} Apr 16 20:25:17.109868 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:17.109823 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 20:25:17.112437 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:17.112406 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" path="/var/lib/kubelet/pods/54c960e7-7676-48ae-bceb-56b590f7b19e/volumes" Apr 16 20:25:19.013948 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:19.013919 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx_bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/storage-initializer/0.log" Apr 16 20:25:19.014342 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:19.013958 2570 generic.go:358] "Generic (PLEG): container finished" podID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerID="c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf" exitCode=1 Apr 16 20:25:19.014342 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:19.014009 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" event={"ID":"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9","Type":"ContainerDied","Data":"c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf"} Apr 16 20:25:20.018549 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.018521 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx_bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/storage-initializer/0.log" Apr 16 
20:25:20.018935 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.018626 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" event={"ID":"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9","Type":"ContainerStarted","Data":"7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de"} Apr 16 20:25:20.321506 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.321481 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:25:20.378346 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.378303 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c-kserve-provision-location\") pod \"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c\" (UID: \"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c\") " Apr 16 20:25:20.378645 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.378618 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" (UID: "0e0df88b-d63c-4cd4-9dbc-dc3892bb180c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:20.479953 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.479914 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:25:20.939525 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:20.939492 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx"] Apr 16 20:25:21.023841 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.023808 2570 generic.go:358] "Generic (PLEG): container finished" podID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerID="470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd" exitCode=0 Apr 16 20:25:21.024421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.023881 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" Apr 16 20:25:21.024421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.023893 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" event={"ID":"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c","Type":"ContainerDied","Data":"470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd"} Apr 16 20:25:21.024421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.023943 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh" event={"ID":"0e0df88b-d63c-4cd4-9dbc-dc3892bb180c","Type":"ContainerDied","Data":"aecb734c28b5939ec7bdfeee902782d7387a57b15961c30676176fa87efb3216"} Apr 16 20:25:21.024421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.023961 2570 scope.go:117] "RemoveContainer" containerID="470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd" Apr 16 20:25:21.024421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.024233 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" containerID="cri-o://7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de" gracePeriod=30 Apr 16 20:25:21.029495 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029469 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr"] Apr 16 20:25:21.029879 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029862 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" Apr 16 20:25:21.029879 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029880 2570 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029892 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029898 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029908 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029917 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029933 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.029943 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.030010 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerName="storage-initializer" Apr 16 20:25:21.030029 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.030021 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" containerName="kserve-container" Apr 16 20:25:21.030516 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.030145 2570 
memory_manager.go:356] "RemoveStaleState removing state" podUID="54c960e7-7676-48ae-bceb-56b590f7b19e" containerName="storage-initializer" Apr 16 20:25:21.037491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.037470 2570 scope.go:117] "RemoveContainer" containerID="255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1" Apr 16 20:25:21.043038 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.043014 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:25:21.043262 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.043241 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr"] Apr 16 20:25:21.045921 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.045721 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t49kf\"" Apr 16 20:25:21.050095 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.050050 2570 scope.go:117] "RemoveContainer" containerID="470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd" Apr 16 20:25:21.050429 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:21.050406 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd\": container with ID starting with 470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd not found: ID does not exist" containerID="470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd" Apr 16 20:25:21.050514 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.050436 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd"} err="failed to get container status 
\"470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd\": rpc error: code = NotFound desc = could not find container \"470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd\": container with ID starting with 470007add3276b33346dcd1930b67026a41f027259b50d4bab56aa6da476eedd not found: ID does not exist" Apr 16 20:25:21.050514 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.050461 2570 scope.go:117] "RemoveContainer" containerID="255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1" Apr 16 20:25:21.050762 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:21.050736 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1\": container with ID starting with 255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1 not found: ID does not exist" containerID="255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1" Apr 16 20:25:21.050918 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.050766 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1"} err="failed to get container status \"255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1\": rpc error: code = NotFound desc = could not find container \"255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1\": container with ID starting with 255082b80bca54ae29509bfcfa48459a2265264e2e5a296542de72196c3953e1 not found: ID does not exist" Apr 16 20:25:21.052812 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.052627 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"] Apr 16 20:25:21.054920 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.054898 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-primary-db5133-predictor-5d499d5b4f-2qqlh"] Apr 16 20:25:21.111587 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.111554 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0df88b-d63c-4cd4-9dbc-dc3892bb180c" path="/var/lib/kubelet/pods/0e0df88b-d63c-4cd4-9dbc-dc3892bb180c/volumes" Apr 16 20:25:21.186746 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.186700 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cec59a21-153e-4cc9-b5ce-dbada8f3eda5-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-5xnnr\" (UID: \"cec59a21-153e-4cc9-b5ce-dbada8f3eda5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:25:21.287355 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.287295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cec59a21-153e-4cc9-b5ce-dbada8f3eda5-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-5xnnr\" (UID: \"cec59a21-153e-4cc9-b5ce-dbada8f3eda5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:25:21.287731 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.287707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cec59a21-153e-4cc9-b5ce-dbada8f3eda5-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-5xnnr\" (UID: \"cec59a21-153e-4cc9-b5ce-dbada8f3eda5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:25:21.356442 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.356410 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:25:21.478716 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:21.478690 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr"] Apr 16 20:25:21.481450 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:25:21.481418 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec59a21_153e_4cc9_b5ce_dbada8f3eda5.slice/crio-87349d7ba9bdc2019150647eb35b9c45f533c32c873fd5d5663100f965f0b182 WatchSource:0}: Error finding container 87349d7ba9bdc2019150647eb35b9c45f533c32c873fd5d5663100f965f0b182: Status 404 returned error can't find the container with id 87349d7ba9bdc2019150647eb35b9c45f533c32c873fd5d5663100f965f0b182 Apr 16 20:25:22.030085 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.030028 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" event={"ID":"cec59a21-153e-4cc9-b5ce-dbada8f3eda5","Type":"ContainerStarted","Data":"d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5"} Apr 16 20:25:22.030085 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.030085 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" event={"ID":"cec59a21-153e-4cc9-b5ce-dbada8f3eda5","Type":"ContainerStarted","Data":"87349d7ba9bdc2019150647eb35b9c45f533c32c873fd5d5663100f965f0b182"} Apr 16 20:25:22.674413 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.674386 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx_bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/storage-initializer/1.log" Apr 16 20:25:22.674772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.674757 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx_bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/storage-initializer/0.log" Apr 16 20:25:22.674827 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.674819 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:22.800428 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.800395 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-kserve-provision-location\") pod \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " Apr 16 20:25:22.800606 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.800473 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-cabundle-cert\") pod \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\" (UID: \"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9\") " Apr 16 20:25:22.800690 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.800662 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" (UID: "bed6218e-4fa8-4d4b-a64a-39cc33b6fff9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:22.800817 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.800796 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" (UID: "bed6218e-4fa8-4d4b-a64a-39cc33b6fff9"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:25:22.901333 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.901298 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-cabundle-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:25:22.901333 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:22.901327 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:25:23.034435 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034408 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx_bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/storage-initializer/1.log" Apr 16 20:25:23.034785 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034772 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx_bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/storage-initializer/0.log" Apr 16 20:25:23.034843 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034810 2570 generic.go:358] "Generic (PLEG): container finished" podID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerID="7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de" exitCode=1 Apr 16 
20:25:23.034913 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" event={"ID":"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9","Type":"ContainerDied","Data":"7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de"} Apr 16 20:25:23.034952 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034928 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" event={"ID":"bed6218e-4fa8-4d4b-a64a-39cc33b6fff9","Type":"ContainerDied","Data":"567580de7f2703f27586655add0bde235443b440256e6ce7ad5bc2a00286f2de"} Apr 16 20:25:23.034952 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034944 2570 scope.go:117] "RemoveContainer" containerID="7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de" Apr 16 20:25:23.035031 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.034896 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx" Apr 16 20:25:23.043555 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.043536 2570 scope.go:117] "RemoveContainer" containerID="c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf" Apr 16 20:25:23.050910 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.050895 2570 scope.go:117] "RemoveContainer" containerID="7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de" Apr 16 20:25:23.051184 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:23.051164 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de\": container with ID starting with 7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de not found: ID does not exist" containerID="7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de" Apr 16 20:25:23.051236 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.051195 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de"} err="failed to get container status \"7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de\": rpc error: code = NotFound desc = could not find container \"7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de\": container with ID starting with 7dd231f4111da2a4b8d918ed873fe2214530fc2c6c41384ee40bde7b451c38de not found: ID does not exist" Apr 16 20:25:23.051236 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.051214 2570 scope.go:117] "RemoveContainer" containerID="c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf" Apr 16 20:25:23.051447 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:25:23.051428 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf\": container with ID starting with c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf not found: ID does not exist" containerID="c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf" Apr 16 20:25:23.051485 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.051454 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf"} err="failed to get container status \"c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf\": rpc error: code = NotFound desc = could not find container \"c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf\": container with ID starting with c11e9a353a7472462139c6dc8d6e4b5290559cd7c04228405955b8b98c61adaf not found: ID does not exist" Apr 16 20:25:23.070428 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.070399 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx"] Apr 16 20:25:23.074971 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.074933 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8bbbb8-predictor-b9c9d44dd-lxjbx"] Apr 16 20:25:23.113014 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:23.112978 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" path="/var/lib/kubelet/pods/bed6218e-4fa8-4d4b-a64a-39cc33b6fff9/volumes" Apr 16 20:25:26.045972 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:26.045940 2570 generic.go:358] "Generic (PLEG): container finished" podID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerID="d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5" exitCode=0 Apr 16 20:25:26.046410 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:26.046015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" event={"ID":"cec59a21-153e-4cc9-b5ce-dbada8f3eda5","Type":"ContainerDied","Data":"d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5"} Apr 16 20:25:46.126809 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:46.126712 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" event={"ID":"cec59a21-153e-4cc9-b5ce-dbada8f3eda5","Type":"ContainerStarted","Data":"cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977"} Apr 16 20:25:46.127318 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:46.127074 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:25:46.128264 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:46.128234 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:25:46.143454 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:46.143403 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podStartSLOduration=5.335418936 podStartE2EDuration="25.143388656s" podCreationTimestamp="2026-04-16 20:25:21 +0000 UTC" firstStartedPulling="2026-04-16 20:25:26.047325318 +0000 UTC m=+1889.582191332" lastFinishedPulling="2026-04-16 20:25:45.855295038 +0000 UTC m=+1909.390161052" observedRunningTime="2026-04-16 20:25:46.142593242 +0000 UTC m=+1909.677459305" watchObservedRunningTime="2026-04-16 20:25:46.143388656 +0000 UTC m=+1909.678254693" Apr 16 20:25:47.130351 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:47.130306 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:25:57.131078 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:25:57.130995 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:26:07.130863 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:26:07.130806 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:26:17.131167 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:26:17.131123 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:26:27.130865 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:26:27.130818 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:26:37.131044 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:26:37.130993 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" 
podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:26:47.130339 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:26:47.130251 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:26:57.130737 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:26:57.130693 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 20:27:07.131144 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:07.131109 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:27:11.217770 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.217731 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr"] Apr 16 20:27:11.218283 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.218092 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" containerID="cri-o://cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977" gracePeriod=30 Apr 16 20:27:11.339133 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339096 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"] Apr 16 
20:27:11.339506 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339492 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" Apr 16 20:27:11.339556 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339508 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" Apr 16 20:27:11.339556 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339524 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" Apr 16 20:27:11.339556 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339529 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" Apr 16 20:27:11.339648 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339595 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" Apr 16 20:27:11.339648 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.339606 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bed6218e-4fa8-4d4b-a64a-39cc33b6fff9" containerName="storage-initializer" Apr 16 20:27:11.342758 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.342740 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:27:11.350431 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.350409 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"] Apr 16 20:27:11.466539 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.466500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34af86b-fe28-4018-a2a5-c473364d59ac-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq\" (UID: \"c34af86b-fe28-4018-a2a5-c473364d59ac\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:27:11.567867 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.567824 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34af86b-fe28-4018-a2a5-c473364d59ac-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq\" (UID: \"c34af86b-fe28-4018-a2a5-c473364d59ac\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:27:11.568327 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.568306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34af86b-fe28-4018-a2a5-c473364d59ac-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq\" (UID: \"c34af86b-fe28-4018-a2a5-c473364d59ac\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:27:11.654480 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.654438 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:27:11.783826 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:11.783795 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"] Apr 16 20:27:11.785923 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:27:11.785892 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc34af86b_fe28_4018_a2a5_c473364d59ac.slice/crio-bdfa5eb178332b5d5a430a95c601b3c630810d39716043b71f0207574624fd99 WatchSource:0}: Error finding container bdfa5eb178332b5d5a430a95c601b3c630810d39716043b71f0207574624fd99: Status 404 returned error can't find the container with id bdfa5eb178332b5d5a430a95c601b3c630810d39716043b71f0207574624fd99 Apr 16 20:27:12.440984 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:12.440943 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" event={"ID":"c34af86b-fe28-4018-a2a5-c473364d59ac","Type":"ContainerStarted","Data":"cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f"} Apr 16 20:27:12.440984 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:12.440986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" event={"ID":"c34af86b-fe28-4018-a2a5-c473364d59ac","Type":"ContainerStarted","Data":"bdfa5eb178332b5d5a430a95c601b3c630810d39716043b71f0207574624fd99"} Apr 16 20:27:15.452269 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:15.452235 2570 generic.go:358] "Generic (PLEG): container finished" podID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerID="cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f" exitCode=0 Apr 16 20:27:15.452667 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:15.452322 2570 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" event={"ID":"c34af86b-fe28-4018-a2a5-c473364d59ac","Type":"ContainerDied","Data":"cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f"} Apr 16 20:27:16.265881 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.265856 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:27:16.415028 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.414921 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cec59a21-153e-4cc9-b5ce-dbada8f3eda5-kserve-provision-location\") pod \"cec59a21-153e-4cc9-b5ce-dbada8f3eda5\" (UID: \"cec59a21-153e-4cc9-b5ce-dbada8f3eda5\") " Apr 16 20:27:16.415307 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.415282 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec59a21-153e-4cc9-b5ce-dbada8f3eda5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cec59a21-153e-4cc9-b5ce-dbada8f3eda5" (UID: "cec59a21-153e-4cc9-b5ce-dbada8f3eda5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:27:16.457601 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.457563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" event={"ID":"c34af86b-fe28-4018-a2a5-c473364d59ac","Type":"ContainerStarted","Data":"5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be"} Apr 16 20:27:16.458102 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.457885 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:27:16.459039 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.459013 2570 generic.go:358] "Generic (PLEG): container finished" podID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerID="cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977" exitCode=0 Apr 16 20:27:16.459158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.459050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" event={"ID":"cec59a21-153e-4cc9-b5ce-dbada8f3eda5","Type":"ContainerDied","Data":"cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977"} Apr 16 20:27:16.459158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.459086 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" event={"ID":"cec59a21-153e-4cc9-b5ce-dbada8f3eda5","Type":"ContainerDied","Data":"87349d7ba9bdc2019150647eb35b9c45f533c32c873fd5d5663100f965f0b182"} Apr 16 20:27:16.459158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.459102 2570 scope.go:117] "RemoveContainer" containerID="cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977" Apr 16 20:27:16.459158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.459128 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr" Apr 16 20:27:16.459659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.459629 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:27:16.467552 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.467530 2570 scope.go:117] "RemoveContainer" containerID="d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5" Apr 16 20:27:16.474211 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.474159 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podStartSLOduration=5.474144681 podStartE2EDuration="5.474144681s" podCreationTimestamp="2026-04-16 20:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:16.473301408 +0000 UTC m=+2000.008167445" watchObservedRunningTime="2026-04-16 20:27:16.474144681 +0000 UTC m=+2000.009010718" Apr 16 20:27:16.475713 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.475690 2570 scope.go:117] "RemoveContainer" containerID="cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977" Apr 16 20:27:16.475974 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:27:16.475953 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977\": container with ID starting with cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977 not found: ID does not exist" containerID="cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977" Apr 16 20:27:16.476081 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.475987 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977"} err="failed to get container status \"cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977\": rpc error: code = NotFound desc = could not find container \"cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977\": container with ID starting with cd8479302e3fe9242778b9e1ed63c414d9cdaa9c0b13abd9aef5e0a020a3b977 not found: ID does not exist" Apr 16 20:27:16.476081 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.476016 2570 scope.go:117] "RemoveContainer" containerID="d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5" Apr 16 20:27:16.476332 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:27:16.476315 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5\": container with ID starting with d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5 not found: ID does not exist" containerID="d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5" Apr 16 20:27:16.476375 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.476340 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5"} err="failed to get container status \"d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5\": rpc error: code = NotFound desc = could not find container \"d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5\": container with ID starting with d866e82d6a510b9bd53d0add447506bd372e2cf90d7ea72577fa3bc0e28ab1e5 not found: ID does not exist" Apr 16 20:27:16.486553 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.486520 2570 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr"] Apr 16 20:27:16.490544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.490517 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-5xnnr"] Apr 16 20:27:16.516367 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:16.516339 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cec59a21-153e-4cc9-b5ce-dbada8f3eda5-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:27:17.112778 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:17.112739 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" path="/var/lib/kubelet/pods/cec59a21-153e-4cc9-b5ce-dbada8f3eda5/volumes" Apr 16 20:27:17.463718 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:17.463628 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:27:27.464646 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:27.464589 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:27:37.464194 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:37.464143 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.54:8080: connect: connection refused" Apr 16 20:27:47.464335 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:47.464283 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:27:57.464302 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:27:57.464250 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:28:07.464291 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:07.464243 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:28:17.463753 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:17.463649 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:28:27.464571 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:27.464518 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 20:28:37.464985 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:28:37.464953 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" Apr 16 20:28:41.454422 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.454387 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"] Apr 16 20:28:41.454853 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.454638 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container" containerID="cri-o://5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be" gracePeriod=30 Apr 16 20:28:41.601265 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.601227 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"] Apr 16 20:28:41.601680 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.601659 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" Apr 16 20:28:41.601680 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.601679 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" Apr 16 20:28:41.601908 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.601691 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="storage-initializer" Apr 16 20:28:41.601908 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.601700 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="storage-initializer" Apr 16 20:28:41.601908 ip-10-0-129-34 kubenswrapper[2570]: I0416 
20:28:41.601776 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cec59a21-153e-4cc9-b5ce-dbada8f3eda5" containerName="kserve-container" Apr 16 20:28:41.605157 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.605135 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" Apr 16 20:28:41.612937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.612916 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"] Apr 16 20:28:41.668445 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.668408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f41bfe-8250-438f-97ca-e63b51d55823-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-pxd8k\" (UID: \"66f41bfe-8250-438f-97ca-e63b51d55823\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" Apr 16 20:28:41.769760 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.769725 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f41bfe-8250-438f-97ca-e63b51d55823-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-pxd8k\" (UID: \"66f41bfe-8250-438f-97ca-e63b51d55823\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" Apr 16 20:28:41.770221 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.770194 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f41bfe-8250-438f-97ca-e63b51d55823-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-pxd8k\" (UID: \"66f41bfe-8250-438f-97ca-e63b51d55823\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" Apr 16 20:28:41.916690 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:41.916647 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" Apr 16 20:28:42.043077 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:42.043035 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"] Apr 16 20:28:42.045631 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:28:42.045602 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66f41bfe_8250_438f_97ca_e63b51d55823.slice/crio-d3655cdf893219e873159541eb64d5bf7a3a1016e2eeff783ba3523fbf0f3ed1 WatchSource:0}: Error finding container d3655cdf893219e873159541eb64d5bf7a3a1016e2eeff783ba3523fbf0f3ed1: Status 404 returned error can't find the container with id d3655cdf893219e873159541eb64d5bf7a3a1016e2eeff783ba3523fbf0f3ed1 Apr 16 20:28:42.760895 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:42.760857 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" event={"ID":"66f41bfe-8250-438f-97ca-e63b51d55823","Type":"ContainerStarted","Data":"3f42a3ce00c1e4d523d1ec6cb414b683c1cf45fdf232ac0e5644151cdf079d9b"} Apr 16 20:28:42.760895 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:42.760897 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" event={"ID":"66f41bfe-8250-438f-97ca-e63b51d55823","Type":"ContainerStarted","Data":"d3655cdf893219e873159541eb64d5bf7a3a1016e2eeff783ba3523fbf0f3ed1"} Apr 16 20:28:45.775267 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:45.775232 2570 generic.go:358] "Generic (PLEG): container finished" podID="66f41bfe-8250-438f-97ca-e63b51d55823" 
containerID="3f42a3ce00c1e4d523d1ec6cb414b683c1cf45fdf232ac0e5644151cdf079d9b" exitCode=0
Apr 16 20:28:45.775852 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:45.775295 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" event={"ID":"66f41bfe-8250-438f-97ca-e63b51d55823","Type":"ContainerDied","Data":"3f42a3ce00c1e4d523d1ec6cb414b683c1cf45fdf232ac0e5644151cdf079d9b"}
Apr 16 20:28:46.395733 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.395706 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"
Apr 16 20:28:46.512306 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.512274 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34af86b-fe28-4018-a2a5-c473364d59ac-kserve-provision-location\") pod \"c34af86b-fe28-4018-a2a5-c473364d59ac\" (UID: \"c34af86b-fe28-4018-a2a5-c473364d59ac\") "
Apr 16 20:28:46.512658 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.512627 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34af86b-fe28-4018-a2a5-c473364d59ac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c34af86b-fe28-4018-a2a5-c473364d59ac" (UID: "c34af86b-fe28-4018-a2a5-c473364d59ac"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:28:46.613886 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.613848 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34af86b-fe28-4018-a2a5-c473364d59ac-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:28:46.781257 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.781160 2570 generic.go:358] "Generic (PLEG): container finished" podID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerID="5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be" exitCode=0
Apr 16 20:28:46.781257 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.781227 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"
Apr 16 20:28:46.781728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.781247 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" event={"ID":"c34af86b-fe28-4018-a2a5-c473364d59ac","Type":"ContainerDied","Data":"5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be"}
Apr 16 20:28:46.781728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.781290 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq" event={"ID":"c34af86b-fe28-4018-a2a5-c473364d59ac","Type":"ContainerDied","Data":"bdfa5eb178332b5d5a430a95c601b3c630810d39716043b71f0207574624fd99"}
Apr 16 20:28:46.781728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.781309 2570 scope.go:117] "RemoveContainer" containerID="5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be"
Apr 16 20:28:46.783200 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.783171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" event={"ID":"66f41bfe-8250-438f-97ca-e63b51d55823","Type":"ContainerStarted","Data":"03016ef04f9f96e8af5cd0e6c55c7549091b174d453fb63952e07d03c76e4446"}
Apr 16 20:28:46.783489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.783473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"
Apr 16 20:28:46.785166 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.785134 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:28:46.790219 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.790199 2570 scope.go:117] "RemoveContainer" containerID="cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f"
Apr 16 20:28:46.797576 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.797559 2570 scope.go:117] "RemoveContainer" containerID="5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be"
Apr 16 20:28:46.797811 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:28:46.797792 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be\": container with ID starting with 5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be not found: ID does not exist" containerID="5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be"
Apr 16 20:28:46.797909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.797818 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be"} err="failed to get container status \"5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be\": rpc error: code = NotFound desc = could not find container \"5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be\": container with ID starting with 5f3271eb78e199a94c3408e99636d6ffe734db1e42d8308f24ca32b08bc439be not found: ID does not exist"
Apr 16 20:28:46.797909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.797838 2570 scope.go:117] "RemoveContainer" containerID="cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f"
Apr 16 20:28:46.798096 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:28:46.798080 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f\": container with ID starting with cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f not found: ID does not exist" containerID="cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f"
Apr 16 20:28:46.798141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.798102 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f"} err="failed to get container status \"cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f\": rpc error: code = NotFound desc = could not find container \"cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f\": container with ID starting with cab51af3204338f725e050a415e3ce1af24540cf404f6c1a48d8c49059c7bb8f not found: ID does not exist"
Apr 16 20:28:46.801957 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.801915 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podStartSLOduration=5.801904212 podStartE2EDuration="5.801904212s" podCreationTimestamp="2026-04-16 20:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:28:46.799794402 +0000 UTC m=+2090.334660440" watchObservedRunningTime="2026-04-16 20:28:46.801904212 +0000 UTC m=+2090.336770247"
Apr 16 20:28:46.813741 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.813707 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"]
Apr 16 20:28:46.817505 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:46.817479 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hq6zq"]
Apr 16 20:28:47.110953 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:47.110873 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" path="/var/lib/kubelet/pods/c34af86b-fe28-4018-a2a5-c473364d59ac/volumes"
Apr 16 20:28:47.788286 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:47.788193 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:28:57.788377 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:28:57.788333 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:29:07.788372 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:29:07.788319 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:29:17.789255 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:29:17.789202 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:29:27.788463 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:29:27.788414 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:29:37.788846 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:29:37.788800 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:29:47.788772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:29:47.788674 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:29:57.789080 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:29:57.789008 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:30:02.108658 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:02.108626 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"
Apr 16 20:30:11.653617 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.653575 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"]
Apr 16 20:30:11.654167 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.653972 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" containerID="cri-o://03016ef04f9f96e8af5cd0e6c55c7549091b174d453fb63952e07d03c76e4446" gracePeriod=30
Apr 16 20:30:11.746329 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.746296 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"]
Apr 16 20:30:11.746700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.746687 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container"
Apr 16 20:30:11.746700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.746701 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container"
Apr 16 20:30:11.746804 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.746715 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="storage-initializer"
Apr 16 20:30:11.746804 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.746720 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="storage-initializer"
Apr 16 20:30:11.746804 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.746785 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34af86b-fe28-4018-a2a5-c473364d59ac" containerName="kserve-container"
Apr 16 20:30:11.749818 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.749802 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:30:11.756871 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.756828 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"]
Apr 16 20:30:11.829837 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.829795 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f741b975-ff3d-459d-aac4-1ca49f74318f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq\" (UID: \"f741b975-ff3d-459d-aac4-1ca49f74318f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:30:11.931099 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.930968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f741b975-ff3d-459d-aac4-1ca49f74318f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq\" (UID: \"f741b975-ff3d-459d-aac4-1ca49f74318f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:30:11.931375 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:11.931352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f741b975-ff3d-459d-aac4-1ca49f74318f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq\" (UID: \"f741b975-ff3d-459d-aac4-1ca49f74318f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:30:12.061770 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:12.061733 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:30:12.108319 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:12.108271 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 20:30:12.186411 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:12.186233 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"]
Apr 16 20:30:12.189081 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:30:12.189035 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf741b975_ff3d_459d_aac4_1ca49f74318f.slice/crio-3a157dd861bca2f1274f0a29f502fe512ddc2003b89585605c324985b8f6e707 WatchSource:0}: Error finding container 3a157dd861bca2f1274f0a29f502fe512ddc2003b89585605c324985b8f6e707: Status 404 returned error can't find the container with id 3a157dd861bca2f1274f0a29f502fe512ddc2003b89585605c324985b8f6e707
Apr 16 20:30:12.191154 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:12.191132 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:30:13.072672 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:13.072637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" event={"ID":"f741b975-ff3d-459d-aac4-1ca49f74318f","Type":"ContainerStarted","Data":"a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41"}
Apr 16 20:30:13.073127 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:13.072681 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" event={"ID":"f741b975-ff3d-459d-aac4-1ca49f74318f","Type":"ContainerStarted","Data":"3a157dd861bca2f1274f0a29f502fe512ddc2003b89585605c324985b8f6e707"}
Apr 16 20:30:16.087168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:16.087130 2570 generic.go:358] "Generic (PLEG): container finished" podID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerID="a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41" exitCode=0
Apr 16 20:30:16.087647 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:16.087206 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" event={"ID":"f741b975-ff3d-459d-aac4-1ca49f74318f","Type":"ContainerDied","Data":"a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41"}
Apr 16 20:30:17.092950 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.092919 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" event={"ID":"f741b975-ff3d-459d-aac4-1ca49f74318f","Type":"ContainerStarted","Data":"de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def"}
Apr 16 20:30:17.093347 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.093251 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:30:17.094460 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.094438 2570 generic.go:358] "Generic (PLEG): container finished" podID="66f41bfe-8250-438f-97ca-e63b51d55823" containerID="03016ef04f9f96e8af5cd0e6c55c7549091b174d453fb63952e07d03c76e4446" exitCode=0
Apr 16 20:30:17.094579 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.094504 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" event={"ID":"66f41bfe-8250-438f-97ca-e63b51d55823","Type":"ContainerDied","Data":"03016ef04f9f96e8af5cd0e6c55c7549091b174d453fb63952e07d03c76e4446"}
Apr 16 20:30:17.094579 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.094537 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k" event={"ID":"66f41bfe-8250-438f-97ca-e63b51d55823","Type":"ContainerDied","Data":"d3655cdf893219e873159541eb64d5bf7a3a1016e2eeff783ba3523fbf0f3ed1"}
Apr 16 20:30:17.094579 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.094553 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3655cdf893219e873159541eb64d5bf7a3a1016e2eeff783ba3523fbf0f3ed1"
Apr 16 20:30:17.103176 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.103158 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"
Apr 16 20:30:17.111628 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.111579 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podStartSLOduration=6.11156412 podStartE2EDuration="6.11156412s" podCreationTimestamp="2026-04-16 20:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:17.111127358 +0000 UTC m=+2180.645993396" watchObservedRunningTime="2026-04-16 20:30:17.11156412 +0000 UTC m=+2180.646430155"
Apr 16 20:30:17.177432 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.177400 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f41bfe-8250-438f-97ca-e63b51d55823-kserve-provision-location\") pod \"66f41bfe-8250-438f-97ca-e63b51d55823\" (UID: \"66f41bfe-8250-438f-97ca-e63b51d55823\") "
Apr 16 20:30:17.177798 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.177770 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f41bfe-8250-438f-97ca-e63b51d55823-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "66f41bfe-8250-438f-97ca-e63b51d55823" (UID: "66f41bfe-8250-438f-97ca-e63b51d55823"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:30:17.278753 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:17.278665 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f41bfe-8250-438f-97ca-e63b51d55823-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:30:18.097859 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:18.097822 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"
Apr 16 20:30:18.119107 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:18.119076 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"]
Apr 16 20:30:18.122527 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:18.122501 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-pxd8k"]
Apr 16 20:30:19.111589 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:19.111552 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" path="/var/lib/kubelet/pods/66f41bfe-8250-438f-97ca-e63b51d55823/volumes"
Apr 16 20:30:48.100318 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:48.100270 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 16 20:30:58.098836 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:30:58.098787 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 16 20:31:08.099120 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:08.099078 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 16 20:31:18.098763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:18.098661 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 16 20:31:25.107913 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:25.107856 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 16 20:31:35.112039 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:35.112012 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:31:41.883861 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.883824 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"]
Apr 16 20:31:41.884363 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.884152 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" containerID="cri-o://de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def" gracePeriod=30
Apr 16 20:31:41.982784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.982748 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"]
Apr 16 20:31:41.983188 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.983174 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container"
Apr 16 20:31:41.983244 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.983191 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container"
Apr 16 20:31:41.983244 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.983213 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="storage-initializer"
Apr 16 20:31:41.983244 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.983222 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="storage-initializer"
Apr 16 20:31:41.983338 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.983281 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="66f41bfe-8250-438f-97ca-e63b51d55823" containerName="kserve-container"
Apr 16 20:31:41.986492 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.986469 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"
Apr 16 20:31:41.994880 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:41.994851 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"]
Apr 16 20:31:42.133206 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:42.133156 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0eec79e-c657-4a49-b447-98e4b914c1d9-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg\" (UID: \"a0eec79e-c657-4a49-b447-98e4b914c1d9\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"
Apr 16 20:31:42.234716 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:42.234606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0eec79e-c657-4a49-b447-98e4b914c1d9-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg\" (UID: \"a0eec79e-c657-4a49-b447-98e4b914c1d9\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"
Apr 16 20:31:42.235015 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:42.234990 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0eec79e-c657-4a49-b447-98e4b914c1d9-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg\" (UID: \"a0eec79e-c657-4a49-b447-98e4b914c1d9\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"
Apr 16 20:31:42.297839 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:42.297804 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"
Apr 16 20:31:42.429204 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:42.429178 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"]
Apr 16 20:31:42.431209 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:31:42.431175 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0eec79e_c657_4a49_b447_98e4b914c1d9.slice/crio-2470413c12678b2c6e9bc1dc811ca02f8fd70b1211a0ff21b947387fbe2d1afa WatchSource:0}: Error finding container 2470413c12678b2c6e9bc1dc811ca02f8fd70b1211a0ff21b947387fbe2d1afa: Status 404 returned error can't find the container with id 2470413c12678b2c6e9bc1dc811ca02f8fd70b1211a0ff21b947387fbe2d1afa
Apr 16 20:31:43.401336 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:43.401301 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" event={"ID":"a0eec79e-c657-4a49-b447-98e4b914c1d9","Type":"ContainerStarted","Data":"edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c"}
Apr 16 20:31:43.401336 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:43.401340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" event={"ID":"a0eec79e-c657-4a49-b447-98e4b914c1d9","Type":"ContainerStarted","Data":"2470413c12678b2c6e9bc1dc811ca02f8fd70b1211a0ff21b947387fbe2d1afa"}
Apr 16 20:31:45.107985 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:45.107932 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 16 20:31:46.411275 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:46.411238 2570 generic.go:358] "Generic (PLEG): container finished" podID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerID="edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c" exitCode=0
Apr 16 20:31:46.411668 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:46.411312 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" event={"ID":"a0eec79e-c657-4a49-b447-98e4b914c1d9","Type":"ContainerDied","Data":"edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c"}
Apr 16 20:31:46.921217 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:46.921193 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:31:46.976998 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:46.976968 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f741b975-ff3d-459d-aac4-1ca49f74318f-kserve-provision-location\") pod \"f741b975-ff3d-459d-aac4-1ca49f74318f\" (UID: \"f741b975-ff3d-459d-aac4-1ca49f74318f\") "
Apr 16 20:31:46.977379 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:46.977349 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f741b975-ff3d-459d-aac4-1ca49f74318f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f741b975-ff3d-459d-aac4-1ca49f74318f" (UID: "f741b975-ff3d-459d-aac4-1ca49f74318f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:31:47.078148 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.078113 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f741b975-ff3d-459d-aac4-1ca49f74318f-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:31:47.416414 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.416314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" event={"ID":"a0eec79e-c657-4a49-b447-98e4b914c1d9","Type":"ContainerStarted","Data":"50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261"}
Apr 16 20:31:47.416889 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.416568 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"
Apr 16 20:31:47.417830 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.417805 2570 generic.go:358] "Generic (PLEG): container finished" podID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerID="de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def" exitCode=0
Apr 16 20:31:47.417987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.417840 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" event={"ID":"f741b975-ff3d-459d-aac4-1ca49f74318f","Type":"ContainerDied","Data":"de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def"}
Apr 16 20:31:47.417987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.417858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq" event={"ID":"f741b975-ff3d-459d-aac4-1ca49f74318f","Type":"ContainerDied","Data":"3a157dd861bca2f1274f0a29f502fe512ddc2003b89585605c324985b8f6e707"}
Apr 16 20:31:47.417987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.417873 2570 scope.go:117] "RemoveContainer" containerID="de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def"
Apr 16 20:31:47.417987 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.417876 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"
Apr 16 20:31:47.425871 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.425849 2570 scope.go:117] "RemoveContainer" containerID="a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41"
Apr 16 20:31:47.433763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.433733 2570 scope.go:117] "RemoveContainer" containerID="de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def"
Apr 16 20:31:47.434096 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:31:47.434075 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def\": container with ID starting with de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def not found: ID does not exist" containerID="de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def"
Apr 16 20:31:47.434203 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.434105 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def"} err="failed to get container status \"de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def\": rpc error: code = NotFound desc = could not find container \"de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def\": container with ID starting with de2b048e1b254c243c2394adc2d87c95a5df702603c8297289f51d1416832def not found: ID does not exist"
Apr 16 20:31:47.434203 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.434124 2570 scope.go:117] "RemoveContainer" containerID="a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41"
Apr 16 20:31:47.434423 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:31:47.434400 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41\": container with ID starting with a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41 not found: ID does not exist" containerID="a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41"
Apr 16 20:31:47.434526 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.434429 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41"} err="failed to get container status \"a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41\": rpc error: code = NotFound desc = could not find container \"a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41\": container with ID starting with a983958bd2927c640e948ddb4f94eaa1378c7a744858d8dc7e874ad6175ecb41 not found: ID does not exist"
Apr 16 20:31:47.434526 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.434465 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podStartSLOduration=6.434449015 podStartE2EDuration="6.434449015s" podCreationTimestamp="2026-04-16 20:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:47.432654289 +0000 UTC m=+2270.967520326" watchObservedRunningTime="2026-04-16 20:31:47.434449015 +0000 UTC m=+2270.969315051"
Apr 16 20:31:47.445894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.445868 2570 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"] Apr 16 20:31:47.448874 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:47.448852 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-wfjjq"] Apr 16 20:31:49.111942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:31:49.111906 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" path="/var/lib/kubelet/pods/f741b975-ff3d-459d-aac4-1ca49f74318f/volumes" Apr 16 20:32:18.424784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:32:18.424735 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 20:32:28.424043 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:32:28.423987 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 20:32:38.423736 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:32:38.423680 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 20:32:48.424632 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 20:32:48.424530 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 20:32:53.106970 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:32:53.106925 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 20:33:03.111652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:03.111621 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" Apr 16 20:33:12.107440 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.107399 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"] Apr 16 20:33:12.107909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.107713 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" containerID="cri-o://50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261" gracePeriod=30 Apr 16 20:33:12.184551 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.184516 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh"] Apr 16 20:33:12.184924 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 20:33:12.184910 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" Apr 16 20:33:12.184924 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.184924 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" Apr 16 20:33:12.185027 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.184944 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="storage-initializer" Apr 16 20:33:12.185027 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.184952 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="storage-initializer" Apr 16 20:33:12.185122 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.185029 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f741b975-ff3d-459d-aac4-1ca49f74318f" containerName="kserve-container" Apr 16 20:33:12.188275 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.188257 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:33:12.194935 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.194905 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh"] Apr 16 20:33:12.297275 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.297230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df473ae8-004e-4657-a0a2-2778139b9eae-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh\" (UID: \"df473ae8-004e-4657-a0a2-2778139b9eae\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:33:12.398190 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.398096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df473ae8-004e-4657-a0a2-2778139b9eae-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh\" (UID: \"df473ae8-004e-4657-a0a2-2778139b9eae\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:33:12.398489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.398471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df473ae8-004e-4657-a0a2-2778139b9eae-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh\" (UID: \"df473ae8-004e-4657-a0a2-2778139b9eae\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:33:12.499930 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.499880 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:33:12.625965 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.625939 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh"] Apr 16 20:33:12.628707 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:33:12.628677 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf473ae8_004e_4657_a0a2_2778139b9eae.slice/crio-bd6c7b102b2e2a08cdc24a0e07f8b14cf8d39f19f89e37fec8d75d7a6509037c WatchSource:0}: Error finding container bd6c7b102b2e2a08cdc24a0e07f8b14cf8d39f19f89e37fec8d75d7a6509037c: Status 404 returned error can't find the container with id bd6c7b102b2e2a08cdc24a0e07f8b14cf8d39f19f89e37fec8d75d7a6509037c Apr 16 20:33:12.710459 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.710421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" event={"ID":"df473ae8-004e-4657-a0a2-2778139b9eae","Type":"ContainerStarted","Data":"fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2"} Apr 16 20:33:12.710459 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:12.710458 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" event={"ID":"df473ae8-004e-4657-a0a2-2778139b9eae","Type":"ContainerStarted","Data":"bd6c7b102b2e2a08cdc24a0e07f8b14cf8d39f19f89e37fec8d75d7a6509037c"} Apr 16 20:33:13.107605 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:13.107562 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.57:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 20:33:16.725186 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:16.725095 2570 generic.go:358] "Generic (PLEG): container finished" podID="df473ae8-004e-4657-a0a2-2778139b9eae" containerID="fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2" exitCode=0 Apr 16 20:33:16.725186 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:16.725167 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" event={"ID":"df473ae8-004e-4657-a0a2-2778139b9eae","Type":"ContainerDied","Data":"fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2"} Apr 16 20:33:17.154509 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.154485 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" Apr 16 20:33:17.238430 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.238388 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0eec79e-c657-4a49-b447-98e4b914c1d9-kserve-provision-location\") pod \"a0eec79e-c657-4a49-b447-98e4b914c1d9\" (UID: \"a0eec79e-c657-4a49-b447-98e4b914c1d9\") " Apr 16 20:33:17.238736 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.238708 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0eec79e-c657-4a49-b447-98e4b914c1d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0eec79e-c657-4a49-b447-98e4b914c1d9" (UID: "a0eec79e-c657-4a49-b447-98e4b914c1d9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:33:17.239186 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.239097 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0eec79e-c657-4a49-b447-98e4b914c1d9-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:33:17.730083 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.730015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" event={"ID":"df473ae8-004e-4657-a0a2-2778139b9eae","Type":"ContainerStarted","Data":"0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108"} Apr 16 20:33:17.730531 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.730318 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:33:17.731567 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.731544 2570 generic.go:358] "Generic (PLEG): container finished" podID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerID="50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261" exitCode=0 Apr 16 20:33:17.731665 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.731590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" event={"ID":"a0eec79e-c657-4a49-b447-98e4b914c1d9","Type":"ContainerDied","Data":"50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261"} Apr 16 20:33:17.731665 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.731609 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" event={"ID":"a0eec79e-c657-4a49-b447-98e4b914c1d9","Type":"ContainerDied","Data":"2470413c12678b2c6e9bc1dc811ca02f8fd70b1211a0ff21b947387fbe2d1afa"} 
Apr 16 20:33:17.731665 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.731624 2570 scope.go:117] "RemoveContainer" containerID="50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261" Apr 16 20:33:17.731665 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.731625 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg" Apr 16 20:33:17.741937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.741915 2570 scope.go:117] "RemoveContainer" containerID="edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c" Apr 16 20:33:17.749018 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.748967 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podStartSLOduration=5.748949287 podStartE2EDuration="5.748949287s" podCreationTimestamp="2026-04-16 20:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:33:17.745876614 +0000 UTC m=+2361.280742644" watchObservedRunningTime="2026-04-16 20:33:17.748949287 +0000 UTC m=+2361.283815324" Apr 16 20:33:17.751010 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.750994 2570 scope.go:117] "RemoveContainer" containerID="50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261" Apr 16 20:33:17.751297 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:33:17.751276 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261\": container with ID starting with 50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261 not found: ID does not exist" containerID="50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261" Apr 16 20:33:17.751352 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 20:33:17.751305 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261"} err="failed to get container status \"50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261\": rpc error: code = NotFound desc = could not find container \"50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261\": container with ID starting with 50eb7b0a6924da42eee213b520d140d7c1002d5c2e7edb92af142ff96c69f261 not found: ID does not exist" Apr 16 20:33:17.751352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.751324 2570 scope.go:117] "RemoveContainer" containerID="edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c" Apr 16 20:33:17.751533 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:33:17.751515 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c\": container with ID starting with edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c not found: ID does not exist" containerID="edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c" Apr 16 20:33:17.751581 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.751539 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c"} err="failed to get container status \"edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c\": rpc error: code = NotFound desc = could not find container \"edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c\": container with ID starting with edddf4d9a9a59cc26e0a40602b5dce80d40744a5ddf2de2ff7034ac83b2f9d1c not found: ID does not exist" Apr 16 20:33:17.758882 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.758859 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"] Apr 16 20:33:17.760457 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:17.760434 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-c77kg"] Apr 16 20:33:19.111738 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:19.111700 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" path="/var/lib/kubelet/pods/a0eec79e-c657-4a49-b447-98e4b914c1d9/volumes" Apr 16 20:33:48.738148 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:48.738104 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 20:33:58.737359 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:33:58.737312 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 20:34:08.737022 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:08.736969 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 20:34:18.737555 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:34:18.737401 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 20:34:23.107421 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:23.107377 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 20:34:33.113324 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:33.113293 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:34:42.315726 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:42.315686 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh"] Apr 16 20:34:42.316161 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:42.315950 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" containerID="cri-o://0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108" gracePeriod=30 Apr 16 20:34:43.108128 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:43.108077 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" 
podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 20:34:44.507035 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.506991 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4"] Apr 16 20:34:44.507577 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.507553 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="storage-initializer" Apr 16 20:34:44.507652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.507581 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="storage-initializer" Apr 16 20:34:44.507652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.507596 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" Apr 16 20:34:44.507652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.507606 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" Apr 16 20:34:44.507811 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.507719 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0eec79e-c657-4a49-b447-98e4b914c1d9" containerName="kserve-container" Apr 16 20:34:44.510821 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.510801 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:34:44.517782 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.517755 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4"] Apr 16 20:34:44.612417 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.612387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b5b777f74-rllg4\" (UID: \"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:34:44.713130 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.713050 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b5b777f74-rllg4\" (UID: \"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:34:44.713492 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.713465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b5b777f74-rllg4\" (UID: \"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:34:44.822907 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.822814 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:34:44.948323 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:44.948294 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4"] Apr 16 20:34:44.950869 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:34:44.950837 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod305d7d2e_59ea_4bb6_9c0e_a2c52bb9b713.slice/crio-cd4f8e0f7450a4d6b59d1dfa0e58b497b778fff26db4163d96cccdd92c6a6db0 WatchSource:0}: Error finding container cd4f8e0f7450a4d6b59d1dfa0e58b497b778fff26db4163d96cccdd92c6a6db0: Status 404 returned error can't find the container with id cd4f8e0f7450a4d6b59d1dfa0e58b497b778fff26db4163d96cccdd92c6a6db0 Apr 16 20:34:45.045405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:45.045367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" event={"ID":"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713","Type":"ContainerStarted","Data":"18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd"} Apr 16 20:34:45.045578 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:45.045413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" event={"ID":"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713","Type":"ContainerStarted","Data":"cd4f8e0f7450a4d6b59d1dfa0e58b497b778fff26db4163d96cccdd92c6a6db0"} Apr 16 20:34:47.762937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:47.762908 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:34:47.839327 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:47.839285 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df473ae8-004e-4657-a0a2-2778139b9eae-kserve-provision-location\") pod \"df473ae8-004e-4657-a0a2-2778139b9eae\" (UID: \"df473ae8-004e-4657-a0a2-2778139b9eae\") " Apr 16 20:34:47.839643 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:47.839614 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df473ae8-004e-4657-a0a2-2778139b9eae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df473ae8-004e-4657-a0a2-2778139b9eae" (UID: "df473ae8-004e-4657-a0a2-2778139b9eae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:47.940530 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:47.940428 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df473ae8-004e-4657-a0a2-2778139b9eae-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:34:48.057255 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.057218 2570 generic.go:358] "Generic (PLEG): container finished" podID="df473ae8-004e-4657-a0a2-2778139b9eae" containerID="0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108" exitCode=0 Apr 16 20:34:48.057414 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.057281 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" event={"ID":"df473ae8-004e-4657-a0a2-2778139b9eae","Type":"ContainerDied","Data":"0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108"} Apr 16 20:34:48.057414 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.057317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" event={"ID":"df473ae8-004e-4657-a0a2-2778139b9eae","Type":"ContainerDied","Data":"bd6c7b102b2e2a08cdc24a0e07f8b14cf8d39f19f89e37fec8d75d7a6509037c"} Apr 16 20:34:48.057414 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.057334 2570 scope.go:117] "RemoveContainer" containerID="0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108" Apr 16 20:34:48.057414 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.057349 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh" Apr 16 20:34:48.065474 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.065452 2570 scope.go:117] "RemoveContainer" containerID="fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2" Apr 16 20:34:48.073300 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.073276 2570 scope.go:117] "RemoveContainer" containerID="0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108" Apr 16 20:34:48.073623 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:34:48.073601 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108\": container with ID starting with 0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108 not found: ID does not exist" containerID="0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108" Apr 16 20:34:48.073676 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.073632 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108"} err="failed to get container status 
\"0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108\": rpc error: code = NotFound desc = could not find container \"0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108\": container with ID starting with 0803feb4bc0cdc4e86c7ab2656f8aca08d874b75ac183859db0411d1afa71108 not found: ID does not exist" Apr 16 20:34:48.073676 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.073653 2570 scope.go:117] "RemoveContainer" containerID="fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2" Apr 16 20:34:48.073871 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:34:48.073853 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2\": container with ID starting with fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2 not found: ID does not exist" containerID="fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2" Apr 16 20:34:48.073914 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.073879 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2"} err="failed to get container status \"fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2\": rpc error: code = NotFound desc = could not find container \"fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2\": container with ID starting with fb7e199c6fecacb1d8dfa9875ae19df54b31d7681f72046002bd190ec6de13f2 not found: ID does not exist" Apr 16 20:34:48.082121 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.082093 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh"] Apr 16 20:34:48.086043 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:48.086014 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-ggzlh"] Apr 16 20:34:49.062435 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:49.062403 2570 generic.go:358] "Generic (PLEG): container finished" podID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerID="18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd" exitCode=0 Apr 16 20:34:49.062817 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:49.062476 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" event={"ID":"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713","Type":"ContainerDied","Data":"18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd"} Apr 16 20:34:49.112431 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:49.112394 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" path="/var/lib/kubelet/pods/df473ae8-004e-4657-a0a2-2778139b9eae/volumes" Apr 16 20:34:50.068178 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:50.068139 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" event={"ID":"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713","Type":"ContainerStarted","Data":"2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04"} Apr 16 20:34:50.068587 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:50.068419 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:34:50.069852 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:50.069827 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:34:50.084899 ip-10-0-129-34 kubenswrapper[2570]: I0416 
20:34:50.084850 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podStartSLOduration=6.084836856 podStartE2EDuration="6.084836856s" podCreationTimestamp="2026-04-16 20:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:50.083088725 +0000 UTC m=+2453.617954761" watchObservedRunningTime="2026-04-16 20:34:50.084836856 +0000 UTC m=+2453.619702890" Apr 16 20:34:51.071820 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:51.071771 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:34:57.289972 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:57.289939 2570 scope.go:117] "RemoveContainer" containerID="03016ef04f9f96e8af5cd0e6c55c7549091b174d453fb63952e07d03c76e4446" Apr 16 20:34:57.297715 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:34:57.297692 2570 scope.go:117] "RemoveContainer" containerID="3f42a3ce00c1e4d523d1ec6cb414b683c1cf45fdf232ac0e5644151cdf079d9b" Apr 16 20:35:01.072369 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:01.072324 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:35:11.072621 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:11.072570 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.59:8080: connect: connection refused" Apr 16 20:35:21.072571 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:21.072528 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:35:31.072810 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:31.072763 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:35:41.072747 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:41.072698 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:35:51.072364 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:51.072275 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:35:59.112441 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:35:59.112411 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:36:04.626067 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.626025 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4"] Apr 16 20:36:04.626618 ip-10-0-129-34 kubenswrapper[2570]: 
I0416 20:36:04.626364 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" containerID="cri-o://2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04" gracePeriod=30 Apr 16 20:36:04.707696 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.707657 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k"] Apr 16 20:36:04.708104 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.708089 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="storage-initializer" Apr 16 20:36:04.708158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.708106 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="storage-initializer" Apr 16 20:36:04.708158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.708128 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" Apr 16 20:36:04.708158 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.708134 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" Apr 16 20:36:04.708258 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.708193 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="df473ae8-004e-4657-a0a2-2778139b9eae" containerName="kserve-container" Apr 16 20:36:04.711371 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.711345 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:04.713898 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.713873 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k"] Apr 16 20:36:04.815538 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.815495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e33daf85-00d6-4352-a20d-087f20df5bdf-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-6rf9k\" (UID: \"e33daf85-00d6-4352-a20d-087f20df5bdf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:04.916890 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.916792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e33daf85-00d6-4352-a20d-087f20df5bdf-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-6rf9k\" (UID: \"e33daf85-00d6-4352-a20d-087f20df5bdf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:04.917245 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:04.917222 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e33daf85-00d6-4352-a20d-087f20df5bdf-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-6rf9k\" (UID: \"e33daf85-00d6-4352-a20d-087f20df5bdf\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:05.023113 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:05.023075 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:05.148435 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:05.148410 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k"] Apr 16 20:36:05.150656 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:36:05.150622 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode33daf85_00d6_4352_a20d_087f20df5bdf.slice/crio-a45a25c7f7b5df3b311735d459c404f9868da24368b0ead2784023915b17edd4 WatchSource:0}: Error finding container a45a25c7f7b5df3b311735d459c404f9868da24368b0ead2784023915b17edd4: Status 404 returned error can't find the container with id a45a25c7f7b5df3b311735d459c404f9868da24368b0ead2784023915b17edd4 Apr 16 20:36:05.152563 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:05.152546 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:36:05.337458 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:05.337423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" event={"ID":"e33daf85-00d6-4352-a20d-087f20df5bdf","Type":"ContainerStarted","Data":"5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016"} Apr 16 20:36:05.337458 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:05.337460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" event={"ID":"e33daf85-00d6-4352-a20d-087f20df5bdf","Type":"ContainerStarted","Data":"a45a25c7f7b5df3b311735d459c404f9868da24368b0ead2784023915b17edd4"} Apr 16 20:36:09.108597 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.108543 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" 
podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 20:36:09.281218 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.281191 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:36:09.352695 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.352600 2570 generic.go:358] "Generic (PLEG): container finished" podID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerID="5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016" exitCode=0 Apr 16 20:36:09.352695 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.352647 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" event={"ID":"e33daf85-00d6-4352-a20d-087f20df5bdf","Type":"ContainerDied","Data":"5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016"} Apr 16 20:36:09.354185 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.354159 2570 generic.go:358] "Generic (PLEG): container finished" podID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerID="2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04" exitCode=0 Apr 16 20:36:09.354304 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.354222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" event={"ID":"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713","Type":"ContainerDied","Data":"2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04"} Apr 16 20:36:09.354304 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.354230 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" Apr 16 20:36:09.354304 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.354255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4" event={"ID":"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713","Type":"ContainerDied","Data":"cd4f8e0f7450a4d6b59d1dfa0e58b497b778fff26db4163d96cccdd92c6a6db0"} Apr 16 20:36:09.354304 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.354276 2570 scope.go:117] "RemoveContainer" containerID="2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04" Apr 16 20:36:09.358831 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.358806 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713-kserve-provision-location\") pod \"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713\" (UID: \"305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713\") " Apr 16 20:36:09.359169 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.359143 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" (UID: "305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:09.362231 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.362214 2570 scope.go:117] "RemoveContainer" containerID="18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd" Apr 16 20:36:09.370932 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.370909 2570 scope.go:117] "RemoveContainer" containerID="2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04" Apr 16 20:36:09.371301 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:36:09.371280 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04\": container with ID starting with 2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04 not found: ID does not exist" containerID="2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04" Apr 16 20:36:09.371392 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.371310 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04"} err="failed to get container status \"2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04\": rpc error: code = NotFound desc = could not find container \"2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04\": container with ID starting with 2bb6015d856945b61f8dbbc0b43d37c691d3a0706789548a5bb882606cc4cc04 not found: ID does not exist" Apr 16 20:36:09.371392 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.371331 2570 scope.go:117] "RemoveContainer" containerID="18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd" Apr 16 20:36:09.371578 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:36:09.371560 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd\": container with ID starting with 18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd not found: ID does not exist" containerID="18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd" Apr 16 20:36:09.371637 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.371583 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd"} err="failed to get container status \"18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd\": rpc error: code = NotFound desc = could not find container \"18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd\": container with ID starting with 18ea37cc66bf7118b9d7b8da3113ee6443114ab246471a830368d842a96c23bd not found: ID does not exist" Apr 16 20:36:09.460364 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.460328 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:36:09.676300 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.676265 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4"] Apr 16 20:36:09.679550 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:09.679518 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-rllg4"] Apr 16 20:36:10.359494 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:10.359457 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" event={"ID":"e33daf85-00d6-4352-a20d-087f20df5bdf","Type":"ContainerStarted","Data":"eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af"} Apr 16 
20:36:10.359992 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:10.359680 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:10.378655 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:10.378602 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" podStartSLOduration=6.378586686 podStartE2EDuration="6.378586686s" podCreationTimestamp="2026-04-16 20:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:10.376397549 +0000 UTC m=+2533.911263576" watchObservedRunningTime="2026-04-16 20:36:10.378586686 +0000 UTC m=+2533.913452721" Apr 16 20:36:11.111900 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:11.111849 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" path="/var/lib/kubelet/pods/305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713/volumes" Apr 16 20:36:41.433123 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:41.433071 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:36:51.367593 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:51.367563 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:36:54.887992 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.887952 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k"] Apr 16 20:36:54.888471 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.888262 2570 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="kserve-container" containerID="cri-o://eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af" gracePeriod=30 Apr 16 20:36:54.931681 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.931641 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq"] Apr 16 20:36:54.932157 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.932134 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="storage-initializer" Apr 16 20:36:54.932157 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.932153 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="storage-initializer" Apr 16 20:36:54.932333 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.932179 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" Apr 16 20:36:54.932333 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.932185 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" Apr 16 20:36:54.932333 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.932241 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="305d7d2e-59ea-4bb6-9c0e-a2c52bb9b713" containerName="kserve-container" Apr 16 20:36:54.935595 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.935577 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:36:54.942418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.942389 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq"] Apr 16 20:36:54.947228 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:54.947200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/058e893e-34f4-48e6-9ba9-0eaad368ca08-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq\" (UID: \"058e893e-34f4-48e6-9ba9-0eaad368ca08\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:36:55.047998 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:55.047958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/058e893e-34f4-48e6-9ba9-0eaad368ca08-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq\" (UID: \"058e893e-34f4-48e6-9ba9-0eaad368ca08\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:36:55.048356 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:55.048338 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/058e893e-34f4-48e6-9ba9-0eaad368ca08-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq\" (UID: \"058e893e-34f4-48e6-9ba9-0eaad368ca08\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:36:55.248119 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:55.247993 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:36:55.368993 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:55.368965 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq"] Apr 16 20:36:55.371505 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:36:55.371461 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058e893e_34f4_48e6_9ba9_0eaad368ca08.slice/crio-dd362c61613e8141fc713334a7b49a582911a3b78a261050b278674a03c6d2cc WatchSource:0}: Error finding container dd362c61613e8141fc713334a7b49a582911a3b78a261050b278674a03c6d2cc: Status 404 returned error can't find the container with id dd362c61613e8141fc713334a7b49a582911a3b78a261050b278674a03c6d2cc Apr 16 20:36:55.519114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:55.519070 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" event={"ID":"058e893e-34f4-48e6-9ba9-0eaad368ca08","Type":"ContainerStarted","Data":"5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4"} Apr 16 20:36:55.519114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:36:55.519120 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" event={"ID":"058e893e-34f4-48e6-9ba9-0eaad368ca08","Type":"ContainerStarted","Data":"dd362c61613e8141fc713334a7b49a582911a3b78a261050b278674a03c6d2cc"} Apr 16 20:37:01.365264 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:01.365214 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.60:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 
10.134.0.60:8080: connect: connection refused" Apr 16 20:37:01.541067 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:01.541021 2570 generic.go:358] "Generic (PLEG): container finished" podID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerID="5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4" exitCode=0 Apr 16 20:37:01.541244 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:01.541096 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" event={"ID":"058e893e-34f4-48e6-9ba9-0eaad368ca08","Type":"ContainerDied","Data":"5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4"} Apr 16 20:37:02.536993 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.536963 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:37:02.545617 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.545586 2570 generic.go:358] "Generic (PLEG): container finished" podID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerID="eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af" exitCode=0 Apr 16 20:37:02.545772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.545659 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" Apr 16 20:37:02.545772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.545670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" event={"ID":"e33daf85-00d6-4352-a20d-087f20df5bdf","Type":"ContainerDied","Data":"eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af"} Apr 16 20:37:02.545772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.545714 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k" event={"ID":"e33daf85-00d6-4352-a20d-087f20df5bdf","Type":"ContainerDied","Data":"a45a25c7f7b5df3b311735d459c404f9868da24368b0ead2784023915b17edd4"} Apr 16 20:37:02.545772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.545732 2570 scope.go:117] "RemoveContainer" containerID="eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af" Apr 16 20:37:02.547436 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.547414 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" event={"ID":"058e893e-34f4-48e6-9ba9-0eaad368ca08","Type":"ContainerStarted","Data":"ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91"} Apr 16 20:37:02.547677 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.547660 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:37:02.549107 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.549040 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 20:37:02.554905 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.554883 2570 scope.go:117] "RemoveContainer" containerID="5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016" Apr 16 20:37:02.564870 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.564845 2570 scope.go:117] "RemoveContainer" containerID="eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af" Apr 16 20:37:02.565317 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:37:02.565283 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af\": container with ID starting with eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af not found: ID does not exist" containerID="eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af" Apr 16 20:37:02.565406 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.565328 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af"} err="failed to get container status \"eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af\": rpc error: code = NotFound desc = could not find container \"eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af\": container with ID starting with eb21291a6cbba3ef0f54636badf05399f62a41604a34135e4f565ab8983c36af not found: ID does not exist" Apr 16 20:37:02.565406 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.565348 2570 scope.go:117] "RemoveContainer" containerID="5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016" Apr 16 20:37:02.565651 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:37:02.565634 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016\": container with ID starting with 
5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016 not found: ID does not exist" containerID="5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016" Apr 16 20:37:02.565701 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.565656 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016"} err="failed to get container status \"5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016\": rpc error: code = NotFound desc = could not find container \"5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016\": container with ID starting with 5effc3e1406f77b4c983f8b6a5bd1b6a8657d67ce50d94d3223b4f8219462016 not found: ID does not exist" Apr 16 20:37:02.582973 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.582899 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" podStartSLOduration=8.582879498 podStartE2EDuration="8.582879498s" podCreationTimestamp="2026-04-16 20:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:02.582314958 +0000 UTC m=+2586.117181001" watchObservedRunningTime="2026-04-16 20:37:02.582879498 +0000 UTC m=+2586.117745557" Apr 16 20:37:02.603623 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.603540 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e33daf85-00d6-4352-a20d-087f20df5bdf-kserve-provision-location\") pod \"e33daf85-00d6-4352-a20d-087f20df5bdf\" (UID: \"e33daf85-00d6-4352-a20d-087f20df5bdf\") " Apr 16 20:37:02.603886 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.603854 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e33daf85-00d6-4352-a20d-087f20df5bdf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e33daf85-00d6-4352-a20d-087f20df5bdf" (UID: "e33daf85-00d6-4352-a20d-087f20df5bdf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:02.704630 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.704592 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e33daf85-00d6-4352-a20d-087f20df5bdf-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:37:02.868878 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.868842 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k"] Apr 16 20:37:02.873291 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:02.873252 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-6rf9k"] Apr 16 20:37:03.111836 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:03.111736 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" path="/var/lib/kubelet/pods/e33daf85-00d6-4352-a20d-087f20df5bdf/volumes" Apr 16 20:37:03.552557 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:03.552518 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 20:37:13.552774 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:13.552718 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 20:37:23.553299 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:23.553224 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:37:31.935426 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:31.935396 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq_058e893e-34f4-48e6-9ba9-0eaad368ca08/kserve-container/0.log" Apr 16 20:37:32.052807 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.052770 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq"] Apr 16 20:37:32.053120 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.053066 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="kserve-container" containerID="cri-o://ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91" gracePeriod=30 Apr 16 20:37:32.125566 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.125525 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"] Apr 16 20:37:32.125941 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.125929 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="kserve-container" Apr 16 20:37:32.125996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.125943 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="kserve-container" Apr 16 20:37:32.125996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.125968 2570 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="storage-initializer" Apr 16 20:37:32.125996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.125974 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="storage-initializer" Apr 16 20:37:32.126131 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.126073 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e33daf85-00d6-4352-a20d-087f20df5bdf" containerName="kserve-container" Apr 16 20:37:32.130704 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.130680 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:37:32.136348 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.136317 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"] Apr 16 20:37:32.265642 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.265611 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85b74c0e-88f3-4360-9b75-6444459a2207-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj\" (UID: \"85b74c0e-88f3-4360-9b75-6444459a2207\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:37:32.366905 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.366871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85b74c0e-88f3-4360-9b75-6444459a2207-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj\" (UID: \"85b74c0e-88f3-4360-9b75-6444459a2207\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:37:32.367285 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.367266 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85b74c0e-88f3-4360-9b75-6444459a2207-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj\" (UID: \"85b74c0e-88f3-4360-9b75-6444459a2207\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:37:32.443598 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.443559 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:37:32.569894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.569710 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"] Apr 16 20:37:32.572227 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:37:32.572193 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85b74c0e_88f3_4360_9b75_6444459a2207.slice/crio-f0a1e69eeb211925acfd64c1fd3a6ebc2a4c88af22214fcbc8f85ee9caefb955 WatchSource:0}: Error finding container f0a1e69eeb211925acfd64c1fd3a6ebc2a4c88af22214fcbc8f85ee9caefb955: Status 404 returned error can't find the container with id f0a1e69eeb211925acfd64c1fd3a6ebc2a4c88af22214fcbc8f85ee9caefb955 Apr 16 20:37:32.655418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.655375 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" event={"ID":"85b74c0e-88f3-4360-9b75-6444459a2207","Type":"ContainerStarted","Data":"3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099"} Apr 16 20:37:32.655418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:32.655412 
2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" event={"ID":"85b74c0e-88f3-4360-9b75-6444459a2207","Type":"ContainerStarted","Data":"f0a1e69eeb211925acfd64c1fd3a6ebc2a4c88af22214fcbc8f85ee9caefb955"} Apr 16 20:37:33.297491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.297469 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:37:33.376852 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.376761 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/058e893e-34f4-48e6-9ba9-0eaad368ca08-kserve-provision-location\") pod \"058e893e-34f4-48e6-9ba9-0eaad368ca08\" (UID: \"058e893e-34f4-48e6-9ba9-0eaad368ca08\") " Apr 16 20:37:33.400790 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.400751 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058e893e-34f4-48e6-9ba9-0eaad368ca08-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "058e893e-34f4-48e6-9ba9-0eaad368ca08" (UID: "058e893e-34f4-48e6-9ba9-0eaad368ca08"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:33.477667 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.477640 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/058e893e-34f4-48e6-9ba9-0eaad368ca08-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:37:33.659909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.659808 2570 generic.go:358] "Generic (PLEG): container finished" podID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerID="ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91" exitCode=0 Apr 16 20:37:33.659909 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.659889 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" Apr 16 20:37:33.660130 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.659894 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" event={"ID":"058e893e-34f4-48e6-9ba9-0eaad368ca08","Type":"ContainerDied","Data":"ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91"} Apr 16 20:37:33.660130 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.659991 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq" event={"ID":"058e893e-34f4-48e6-9ba9-0eaad368ca08","Type":"ContainerDied","Data":"dd362c61613e8141fc713334a7b49a582911a3b78a261050b278674a03c6d2cc"} Apr 16 20:37:33.660130 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.660007 2570 scope.go:117] "RemoveContainer" containerID="ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91" Apr 16 20:37:33.668629 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.668608 2570 scope.go:117] "RemoveContainer" 
containerID="5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4" Apr 16 20:37:33.676258 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.676234 2570 scope.go:117] "RemoveContainer" containerID="ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91" Apr 16 20:37:33.676530 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:37:33.676505 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91\": container with ID starting with ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91 not found: ID does not exist" containerID="ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91" Apr 16 20:37:33.676583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.676541 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91"} err="failed to get container status \"ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91\": rpc error: code = NotFound desc = could not find container \"ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91\": container with ID starting with ed594330356dee8f06597bf17a77d9f5a7181c8278f2de22cf13c76aa7cbda91 not found: ID does not exist" Apr 16 20:37:33.676583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.676560 2570 scope.go:117] "RemoveContainer" containerID="5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4" Apr 16 20:37:33.676780 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:37:33.676764 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4\": container with ID starting with 5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4 not found: ID does not exist" 
containerID="5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4" Apr 16 20:37:33.676830 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.676783 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4"} err="failed to get container status \"5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4\": rpc error: code = NotFound desc = could not find container \"5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4\": container with ID starting with 5bb6a92bcfc9dc215b2dda5d1f60ca7407ab122554e6e136427e1c28bb26eae4 not found: ID does not exist" Apr 16 20:37:33.681352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.681325 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq"] Apr 16 20:37:33.685264 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:33.685234 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-qdrwq"] Apr 16 20:37:35.112019 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:35.111982 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" path="/var/lib/kubelet/pods/058e893e-34f4-48e6-9ba9-0eaad368ca08/volumes" Apr 16 20:37:36.673173 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:36.673133 2570 generic.go:358] "Generic (PLEG): container finished" podID="85b74c0e-88f3-4360-9b75-6444459a2207" containerID="3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099" exitCode=0 Apr 16 20:37:36.673591 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:36.673206 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" 
event={"ID":"85b74c0e-88f3-4360-9b75-6444459a2207","Type":"ContainerDied","Data":"3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099"} Apr 16 20:37:37.678279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:37.678246 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" event={"ID":"85b74c0e-88f3-4360-9b75-6444459a2207","Type":"ContainerStarted","Data":"a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa"} Apr 16 20:37:37.678772 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:37.678474 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:37:37.696003 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:37:37.695950 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" podStartSLOduration=5.6959324030000005 podStartE2EDuration="5.695932403s" podCreationTimestamp="2026-04-16 20:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:37.6947405 +0000 UTC m=+2621.229606549" watchObservedRunningTime="2026-04-16 20:37:37.695932403 +0000 UTC m=+2621.230798418" Apr 16 20:38:08.732315 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:08.732267 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:38:18.731867 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:18.731808 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" 
containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:38:28.684040 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:28.684008 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" Apr 16 20:38:32.244629 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.244588 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"] Apr 16 20:38:32.245358 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.245294 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="kserve-container" containerID="cri-o://a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa" gracePeriod=30 Apr 16 20:38:32.313895 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.313855 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"] Apr 16 20:38:32.314274 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.314261 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="kserve-container" Apr 16 20:38:32.314331 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.314275 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="kserve-container" Apr 16 20:38:32.314331 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.314285 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="storage-initializer" Apr 16 20:38:32.314331 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.314291 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="storage-initializer" Apr 16 20:38:32.314457 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.314343 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="058e893e-34f4-48e6-9ba9-0eaad368ca08" containerName="kserve-container" Apr 16 20:38:32.317822 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.317802 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" Apr 16 20:38:32.327838 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.327809 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"] Apr 16 20:38:32.387376 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.387342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16f17cf5-68a4-4103-a48d-bb79e8b21404-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7bc9649f6b-68f8d\" (UID: \"16f17cf5-68a4-4103-a48d-bb79e8b21404\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" Apr 16 20:38:32.488335 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.488289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16f17cf5-68a4-4103-a48d-bb79e8b21404-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7bc9649f6b-68f8d\" (UID: \"16f17cf5-68a4-4103-a48d-bb79e8b21404\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" Apr 16 20:38:32.488681 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.488658 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16f17cf5-68a4-4103-a48d-bb79e8b21404-kserve-provision-location\") pod 
\"isvc-sklearn-v2-predictor-7bc9649f6b-68f8d\" (UID: \"16f17cf5-68a4-4103-a48d-bb79e8b21404\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" Apr 16 20:38:32.629183 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.629149 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" Apr 16 20:38:32.760647 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.760579 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"] Apr 16 20:38:32.763578 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:38:32.763551 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f17cf5_68a4_4103_a48d_bb79e8b21404.slice/crio-03b85433d24e7a2112c74913d9edef05ce91a90a930709b3ae8576d846926de1 WatchSource:0}: Error finding container 03b85433d24e7a2112c74913d9edef05ce91a90a930709b3ae8576d846926de1: Status 404 returned error can't find the container with id 03b85433d24e7a2112c74913d9edef05ce91a90a930709b3ae8576d846926de1 Apr 16 20:38:32.866233 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.866197 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" event={"ID":"16f17cf5-68a4-4103-a48d-bb79e8b21404","Type":"ContainerStarted","Data":"41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9"} Apr 16 20:38:32.866233 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:32.866233 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" event={"ID":"16f17cf5-68a4-4103-a48d-bb79e8b21404","Type":"ContainerStarted","Data":"03b85433d24e7a2112c74913d9edef05ce91a90a930709b3ae8576d846926de1"} Apr 16 20:38:36.881407 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:36.881314 2570 generic.go:358] "Generic (PLEG): container 
finished" podID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerID="41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9" exitCode=0 Apr 16 20:38:36.881407 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:36.881384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" event={"ID":"16f17cf5-68a4-4103-a48d-bb79e8b21404","Type":"ContainerDied","Data":"41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9"} Apr 16 20:38:37.886620 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:37.886588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" event={"ID":"16f17cf5-68a4-4103-a48d-bb79e8b21404","Type":"ContainerStarted","Data":"269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2"} Apr 16 20:38:37.887097 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:37.886870 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" Apr 16 20:38:37.888293 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:37.888266 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 20:38:37.914117 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:37.914050 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podStartSLOduration=5.914032985 podStartE2EDuration="5.914032985s" podCreationTimestamp="2026-04-16 20:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:38:37.911696411 +0000 UTC m=+2681.446562448" 
watchObservedRunningTime="2026-04-16 20:38:37.914032985 +0000 UTC m=+2681.448899020"
Apr 16 20:38:38.682447 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:38.682400 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.62:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 16 20:38:38.890472 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:38.890436 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:38:39.879547 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.879522 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"
Apr 16 20:38:39.894178 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.894141 2570 generic.go:358] "Generic (PLEG): container finished" podID="85b74c0e-88f3-4360-9b75-6444459a2207" containerID="a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa" exitCode=0
Apr 16 20:38:39.894544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.894221 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"
Apr 16 20:38:39.894544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.894220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" event={"ID":"85b74c0e-88f3-4360-9b75-6444459a2207","Type":"ContainerDied","Data":"a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa"}
Apr 16 20:38:39.894544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.894333 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj" event={"ID":"85b74c0e-88f3-4360-9b75-6444459a2207","Type":"ContainerDied","Data":"f0a1e69eeb211925acfd64c1fd3a6ebc2a4c88af22214fcbc8f85ee9caefb955"}
Apr 16 20:38:39.894544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.894356 2570 scope.go:117] "RemoveContainer" containerID="a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa"
Apr 16 20:38:39.903275 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.903252 2570 scope.go:117] "RemoveContainer" containerID="3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099"
Apr 16 20:38:39.910997 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.910974 2570 scope.go:117] "RemoveContainer" containerID="a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa"
Apr 16 20:38:39.911287 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:38:39.911263 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa\": container with ID starting with a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa not found: ID does not exist" containerID="a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa"
Apr 16 20:38:39.911375 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.911300 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa"} err="failed to get container status \"a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa\": rpc error: code = NotFound desc = could not find container \"a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa\": container with ID starting with a319b3d474005ddac442f76b7077bfac990c9d205985c1992352a056ea8704aa not found: ID does not exist"
Apr 16 20:38:39.911375 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.911328 2570 scope.go:117] "RemoveContainer" containerID="3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099"
Apr 16 20:38:39.911661 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:38:39.911630 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099\": container with ID starting with 3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099 not found: ID does not exist" containerID="3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099"
Apr 16 20:38:39.911719 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:39.911664 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099"} err="failed to get container status \"3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099\": rpc error: code = NotFound desc = could not find container \"3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099\": container with ID starting with 3c4e3f2593d881e34f9d22f43ee748e12adfcd999b8f9a4723f62767dd0ef099 not found: ID does not exist"
Apr 16 20:38:40.057855 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:40.057816 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85b74c0e-88f3-4360-9b75-6444459a2207-kserve-provision-location\") pod \"85b74c0e-88f3-4360-9b75-6444459a2207\" (UID: \"85b74c0e-88f3-4360-9b75-6444459a2207\") "
Apr 16 20:38:40.058197 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:40.058172 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85b74c0e-88f3-4360-9b75-6444459a2207-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "85b74c0e-88f3-4360-9b75-6444459a2207" (UID: "85b74c0e-88f3-4360-9b75-6444459a2207"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:38:40.158645 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:40.158609 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85b74c0e-88f3-4360-9b75-6444459a2207-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:38:40.216579 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:40.216540 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"]
Apr 16 20:38:40.219937 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:40.219910 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ndrgj"]
Apr 16 20:38:41.111792 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:41.111757 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" path="/var/lib/kubelet/pods/85b74c0e-88f3-4360-9b75-6444459a2207/volumes"
Apr 16 20:38:48.890629 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:48.890517 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:38:58.891269 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:38:58.891220 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:39:08.890633 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:08.890584 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:39:18.890659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:18.890604 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:39:28.890948 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:28.890902 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:39:38.891122 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:38.891075 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 20:39:48.892282 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:48.892240 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"
Apr 16 20:39:52.553669 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.553629 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"]
Apr 16 20:39:52.554113 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.553946 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container" containerID="cri-o://269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2" gracePeriod=30
Apr 16 20:39:52.655114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.655068 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"]
Apr 16 20:39:52.655576 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.655555 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="storage-initializer"
Apr 16 20:39:52.655576 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.655578 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="storage-initializer"
Apr 16 20:39:52.655767 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.655590 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="kserve-container"
Apr 16 20:39:52.655767 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.655599 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="kserve-container"
Apr 16 20:39:52.655767 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.655705 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="85b74c0e-88f3-4360-9b75-6444459a2207" containerName="kserve-container"
Apr 16 20:39:52.666343 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.666314 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:39:52.667441 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.667411 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"]
Apr 16 20:39:52.788966 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.788919 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b63814-2803-4ced-b82f-2c4d77d4b31d-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n\" (UID: \"87b63814-2803-4ced-b82f-2c4d77d4b31d\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:39:52.890467 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.890361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b63814-2803-4ced-b82f-2c4d77d4b31d-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n\" (UID: \"87b63814-2803-4ced-b82f-2c4d77d4b31d\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:39:52.890751 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.890732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b63814-2803-4ced-b82f-2c4d77d4b31d-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n\" (UID: \"87b63814-2803-4ced-b82f-2c4d77d4b31d\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:39:52.977130 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:52.977096 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:39:53.112334 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:53.112305 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"]
Apr 16 20:39:53.112708 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:39:53.112683 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b63814_2803_4ced_b82f_2c4d77d4b31d.slice/crio-73ca0d480308264366812b4854f8f52fbe47e6047616a772d237ee2a9b63494d WatchSource:0}: Error finding container 73ca0d480308264366812b4854f8f52fbe47e6047616a772d237ee2a9b63494d: Status 404 returned error can't find the container with id 73ca0d480308264366812b4854f8f52fbe47e6047616a772d237ee2a9b63494d
Apr 16 20:39:53.149817 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:53.149731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" event={"ID":"87b63814-2803-4ced-b82f-2c4d77d4b31d","Type":"ContainerStarted","Data":"73ca0d480308264366812b4854f8f52fbe47e6047616a772d237ee2a9b63494d"}
Apr 16 20:39:54.155297 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:54.155258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" event={"ID":"87b63814-2803-4ced-b82f-2c4d77d4b31d","Type":"ContainerStarted","Data":"c6c3d288d713a26746392672ca0c4749edbbc5c1401cc8c144391202cb0a529b"}
Apr 16 20:39:56.995192 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:56.995167 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"
Apr 16 20:39:57.126891 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.126813 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16f17cf5-68a4-4103-a48d-bb79e8b21404-kserve-provision-location\") pod \"16f17cf5-68a4-4103-a48d-bb79e8b21404\" (UID: \"16f17cf5-68a4-4103-a48d-bb79e8b21404\") "
Apr 16 20:39:57.127141 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.127114 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f17cf5-68a4-4103-a48d-bb79e8b21404-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "16f17cf5-68a4-4103-a48d-bb79e8b21404" (UID: "16f17cf5-68a4-4103-a48d-bb79e8b21404"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:39:57.169824 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.169792 2570 generic.go:358] "Generic (PLEG): container finished" podID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerID="269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2" exitCode=0
Apr 16 20:39:57.170024 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.169873 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" event={"ID":"16f17cf5-68a4-4103-a48d-bb79e8b21404","Type":"ContainerDied","Data":"269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2"}
Apr 16 20:39:57.170024 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.169904 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"
Apr 16 20:39:57.170024 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.169925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d" event={"ID":"16f17cf5-68a4-4103-a48d-bb79e8b21404","Type":"ContainerDied","Data":"03b85433d24e7a2112c74913d9edef05ce91a90a930709b3ae8576d846926de1"}
Apr 16 20:39:57.170024 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.169946 2570 scope.go:117] "RemoveContainer" containerID="269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2"
Apr 16 20:39:57.171415 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.171397 2570 generic.go:358] "Generic (PLEG): container finished" podID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerID="c6c3d288d713a26746392672ca0c4749edbbc5c1401cc8c144391202cb0a529b" exitCode=0
Apr 16 20:39:57.171512 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.171463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" event={"ID":"87b63814-2803-4ced-b82f-2c4d77d4b31d","Type":"ContainerDied","Data":"c6c3d288d713a26746392672ca0c4749edbbc5c1401cc8c144391202cb0a529b"}
Apr 16 20:39:57.178822 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.178796 2570 scope.go:117] "RemoveContainer" containerID="41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9"
Apr 16 20:39:57.187034 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.187011 2570 scope.go:117] "RemoveContainer" containerID="269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2"
Apr 16 20:39:57.187404 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:39:57.187380 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2\": container with ID starting with 269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2 not found: ID does not exist" containerID="269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2"
Apr 16 20:39:57.187481 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.187414 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2"} err="failed to get container status \"269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2\": rpc error: code = NotFound desc = could not find container \"269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2\": container with ID starting with 269b7ebfba8e7aed5a38f4dfec10d69a855d8a0440f8a16f284659e0e3a14bd2 not found: ID does not exist"
Apr 16 20:39:57.187481 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.187436 2570 scope.go:117] "RemoveContainer" containerID="41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9"
Apr 16 20:39:57.187707 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:39:57.187683 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9\": container with ID starting with 41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9 not found: ID does not exist" containerID="41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9"
Apr 16 20:39:57.187771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.187717 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9"} err="failed to get container status \"41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9\": rpc error: code = NotFound desc = could not find container \"41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9\": container with ID starting with 41844049d8edf971f30137926150d945fd24602bc629fb6b7961ab0e466d7aa9 not found: ID does not exist"
Apr 16 20:39:57.205486 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.205461 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"]
Apr 16 20:39:57.209273 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.209249 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-68f8d"]
Apr 16 20:39:57.227554 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:57.227524 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16f17cf5-68a4-4103-a48d-bb79e8b21404-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:39:58.176277 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:58.176245 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" event={"ID":"87b63814-2803-4ced-b82f-2c4d77d4b31d","Type":"ContainerStarted","Data":"079731b26872ade3501e0550bca2dce0766ecd06b96c43a0ad244ba5897480a2"}
Apr 16 20:39:58.176714 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:58.176523 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:39:58.178018 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:58.177989 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:39:58.197605 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:58.197551 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podStartSLOduration=6.197536254 podStartE2EDuration="6.197536254s" podCreationTimestamp="2026-04-16 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:39:58.195008108 +0000 UTC m=+2761.729874144" watchObservedRunningTime="2026-04-16 20:39:58.197536254 +0000 UTC m=+2761.732402290"
Apr 16 20:39:59.111561 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:59.111515 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" path="/var/lib/kubelet/pods/16f17cf5-68a4-4103-a48d-bb79e8b21404/volumes"
Apr 16 20:39:59.180378 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:39:59.180337 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:40:09.180729 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:40:09.180687 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:40:19.180644 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:40:19.180548 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:40:29.180316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:40:29.180246 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:40:39.180493 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:40:39.180444 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:40:49.180739 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:40:49.180687 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:40:59.180415 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:40:59.180370 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 16 20:41:09.181286 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:09.181247 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:41:12.808280 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.808244 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"]
Apr 16 20:41:12.808761 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.808619 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" containerID="cri-o://079731b26872ade3501e0550bca2dce0766ecd06b96c43a0ad244ba5897480a2" gracePeriod=30
Apr 16 20:41:12.866699 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.866661 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"]
Apr 16 20:41:12.867041 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.867029 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container"
Apr 16 20:41:12.867111 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.867042 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container"
Apr 16 20:41:12.867111 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.867065 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="storage-initializer"
Apr 16 20:41:12.867111 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.867072 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="storage-initializer"
Apr 16 20:41:12.867215 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.867144 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="16f17cf5-68a4-4103-a48d-bb79e8b21404" containerName="kserve-container"
Apr 16 20:41:12.870217 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.870195 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:41:12.878976 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.878951 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"]
Apr 16 20:41:12.986769 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:12.986732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5865f82-5841-41ae-8c5e-53bc9c659b8a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-4pcq4\" (UID: \"b5865f82-5841-41ae-8c5e-53bc9c659b8a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:41:13.087996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.087901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5865f82-5841-41ae-8c5e-53bc9c659b8a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-4pcq4\" (UID: \"b5865f82-5841-41ae-8c5e-53bc9c659b8a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:41:13.088373 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.088349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5865f82-5841-41ae-8c5e-53bc9c659b8a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-4pcq4\" (UID: \"b5865f82-5841-41ae-8c5e-53bc9c659b8a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:41:13.181166 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.181128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:41:13.305177 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.305144 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"]
Apr 16 20:41:13.307366 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:41:13.307339 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5865f82_5841_41ae_8c5e_53bc9c659b8a.slice/crio-ab29a4db45d67673df682be7e0c3da49d9e0398d5d7593d4f89b107a2b1f79d1 WatchSource:0}: Error finding container ab29a4db45d67673df682be7e0c3da49d9e0398d5d7593d4f89b107a2b1f79d1: Status 404 returned error can't find the container with id ab29a4db45d67673df682be7e0c3da49d9e0398d5d7593d4f89b107a2b1f79d1
Apr 16 20:41:13.309142 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.309124 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:41:13.443206 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.443164 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" event={"ID":"b5865f82-5841-41ae-8c5e-53bc9c659b8a","Type":"ContainerStarted","Data":"4c3ca9999296b8d1ccbcbc70ab79ec5d1365e67776a282eb66b85531de6c6945"}
Apr 16 20:41:13.443206 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:13.443207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" event={"ID":"b5865f82-5841-41ae-8c5e-53bc9c659b8a","Type":"ContainerStarted","Data":"ab29a4db45d67673df682be7e0c3da49d9e0398d5d7593d4f89b107a2b1f79d1"}
Apr 16 20:41:17.463410 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.463376 2570 generic.go:358] "Generic (PLEG): container finished" podID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerID="079731b26872ade3501e0550bca2dce0766ecd06b96c43a0ad244ba5897480a2" exitCode=0
Apr 16 20:41:17.463776 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.463448 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" event={"ID":"87b63814-2803-4ced-b82f-2c4d77d4b31d","Type":"ContainerDied","Data":"079731b26872ade3501e0550bca2dce0766ecd06b96c43a0ad244ba5897480a2"}
Apr 16 20:41:17.463776 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.463487 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n" event={"ID":"87b63814-2803-4ced-b82f-2c4d77d4b31d","Type":"ContainerDied","Data":"73ca0d480308264366812b4854f8f52fbe47e6047616a772d237ee2a9b63494d"}
Apr 16 20:41:17.463776 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.463500 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ca0d480308264366812b4854f8f52fbe47e6047616a772d237ee2a9b63494d"
Apr 16 20:41:17.475004 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.474979 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:41:17.629489 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.629458 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b63814-2803-4ced-b82f-2c4d77d4b31d-kserve-provision-location\") pod \"87b63814-2803-4ced-b82f-2c4d77d4b31d\" (UID: \"87b63814-2803-4ced-b82f-2c4d77d4b31d\") "
Apr 16 20:41:17.629786 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.629761 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87b63814-2803-4ced-b82f-2c4d77d4b31d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87b63814-2803-4ced-b82f-2c4d77d4b31d" (UID: "87b63814-2803-4ced-b82f-2c4d77d4b31d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:41:17.730282 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:17.730183 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b63814-2803-4ced-b82f-2c4d77d4b31d-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:41:18.468253 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:18.468219 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerID="4c3ca9999296b8d1ccbcbc70ab79ec5d1365e67776a282eb66b85531de6c6945" exitCode=0
Apr 16 20:41:18.468662 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:18.468295 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" event={"ID":"b5865f82-5841-41ae-8c5e-53bc9c659b8a","Type":"ContainerDied","Data":"4c3ca9999296b8d1ccbcbc70ab79ec5d1365e67776a282eb66b85531de6c6945"}
Apr 16 20:41:18.468662 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:18.468393 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"
Apr 16 20:41:18.500533 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:18.500505 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"]
Apr 16 20:41:18.504395 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:18.504368 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-4rk2n"]
Apr 16 20:41:19.113895 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:19.113848 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" path="/var/lib/kubelet/pods/87b63814-2803-4ced-b82f-2c4d77d4b31d/volumes"
Apr 16 20:41:22.485383 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:22.485286 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" event={"ID":"b5865f82-5841-41ae-8c5e-53bc9c659b8a","Type":"ContainerStarted","Data":"c59763a6942acadafad83cc9aa6f2a7be7e57e8063d692ac1141b88200208b50"}
Apr 16 20:41:22.485826 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:22.485649 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:41:22.486916 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:22.486888 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 16 20:41:22.503885 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:22.503827 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" podStartSLOduration=6.768915827 podStartE2EDuration="10.503809336s" podCreationTimestamp="2026-04-16 20:41:12 +0000 UTC" firstStartedPulling="2026-04-16 20:41:18.469649047 +0000 UTC m=+2842.004515061" lastFinishedPulling="2026-04-16 20:41:22.204542556 +0000 UTC m=+2845.739408570" observedRunningTime="2026-04-16 20:41:22.501934082 +0000 UTC m=+2846.036800143" watchObservedRunningTime="2026-04-16 20:41:22.503809336 +0000 UTC m=+2846.038675373"
Apr 16 20:41:23.489230 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:23.489190 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 16 20:41:33.490042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:33.489995 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 16 20:41:43.490632 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:41:43.490601 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"
Apr 16 20:42:04.182231 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.182141 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"]
Apr 16 20:42:04.183127 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.183088 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container"
containerID="cri-o://c59763a6942acadafad83cc9aa6f2a7be7e57e8063d692ac1141b88200208b50" gracePeriod=30 Apr 16 20:42:04.230498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.230465 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv"] Apr 16 20:42:04.230856 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.230843 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="storage-initializer" Apr 16 20:42:04.230899 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.230857 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="storage-initializer" Apr 16 20:42:04.230899 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.230865 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" Apr 16 20:42:04.230899 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.230870 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" Apr 16 20:42:04.230992 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.230940 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="87b63814-2803-4ced-b82f-2c4d77d4b31d" containerName="kserve-container" Apr 16 20:42:04.234050 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.234029 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:04.244031 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.243999 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv"] Apr 16 20:42:04.346402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.346353 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d0bb27-c523-4224-ae72-c4219f3d0b32-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv\" (UID: \"87d0bb27-c523-4224-ae72-c4219f3d0b32\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:04.447107 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.447011 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d0bb27-c523-4224-ae72-c4219f3d0b32-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv\" (UID: \"87d0bb27-c523-4224-ae72-c4219f3d0b32\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:04.447403 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.447381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d0bb27-c523-4224-ae72-c4219f3d0b32-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv\" (UID: \"87d0bb27-c523-4224-ae72-c4219f3d0b32\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:04.545651 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.545618 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:04.670315 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:04.670289 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv"] Apr 16 20:42:04.672515 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:42:04.672486 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87d0bb27_c523_4224_ae72_c4219f3d0b32.slice/crio-9e446ec99cb6c78fc9b73fffc0bbcb9e344105a9a6e9cbe7971b54150b009ae9 WatchSource:0}: Error finding container 9e446ec99cb6c78fc9b73fffc0bbcb9e344105a9a6e9cbe7971b54150b009ae9: Status 404 returned error can't find the container with id 9e446ec99cb6c78fc9b73fffc0bbcb9e344105a9a6e9cbe7971b54150b009ae9 Apr 16 20:42:05.644600 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:05.644562 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" event={"ID":"87d0bb27-c523-4224-ae72-c4219f3d0b32","Type":"ContainerStarted","Data":"8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329"} Apr 16 20:42:05.644600 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:05.644603 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" event={"ID":"87d0bb27-c523-4224-ae72-c4219f3d0b32","Type":"ContainerStarted","Data":"9e446ec99cb6c78fc9b73fffc0bbcb9e344105a9a6e9cbe7971b54150b009ae9"} Apr 16 20:42:09.659112 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:09.659077 2570 generic.go:358] "Generic (PLEG): container finished" podID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerID="8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329" exitCode=0 Apr 16 20:42:09.659514 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:09.659126 2570 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" event={"ID":"87d0bb27-c523-4224-ae72-c4219f3d0b32","Type":"ContainerDied","Data":"8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329"} Apr 16 20:42:10.664306 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:10.664269 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" event={"ID":"87d0bb27-c523-4224-ae72-c4219f3d0b32","Type":"ContainerStarted","Data":"4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e"} Apr 16 20:42:10.664727 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:10.664560 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:10.665785 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:10.665761 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 16 20:42:10.682606 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:10.682555 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" podStartSLOduration=6.682541301 podStartE2EDuration="6.682541301s" podCreationTimestamp="2026-04-16 20:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:42:10.681452432 +0000 UTC m=+2894.216318469" watchObservedRunningTime="2026-04-16 20:42:10.682541301 +0000 UTC m=+2894.217407334" Apr 16 20:42:11.667626 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:11.667588 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 16 20:42:21.668605 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:21.668572 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:42:34.750041 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:34.750005 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerID="c59763a6942acadafad83cc9aa6f2a7be7e57e8063d692ac1141b88200208b50" exitCode=137 Apr 16 20:42:34.750437 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:34.750082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" event={"ID":"b5865f82-5841-41ae-8c5e-53bc9c659b8a","Type":"ContainerDied","Data":"c59763a6942acadafad83cc9aa6f2a7be7e57e8063d692ac1141b88200208b50"} Apr 16 20:42:34.828717 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:34.828690 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" Apr 16 20:42:34.913379 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:34.913338 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5865f82-5841-41ae-8c5e-53bc9c659b8a-kserve-provision-location\") pod \"b5865f82-5841-41ae-8c5e-53bc9c659b8a\" (UID: \"b5865f82-5841-41ae-8c5e-53bc9c659b8a\") " Apr 16 20:42:34.924250 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:34.924195 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5865f82-5841-41ae-8c5e-53bc9c659b8a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5865f82-5841-41ae-8c5e-53bc9c659b8a" (UID: "b5865f82-5841-41ae-8c5e-53bc9c659b8a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:42:35.014175 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.014133 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5865f82-5841-41ae-8c5e-53bc9c659b8a-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:42:35.757114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.757078 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" event={"ID":"b5865f82-5841-41ae-8c5e-53bc9c659b8a","Type":"ContainerDied","Data":"ab29a4db45d67673df682be7e0c3da49d9e0398d5d7593d4f89b107a2b1f79d1"} Apr 16 20:42:35.757114 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.757097 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4" Apr 16 20:42:35.757612 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.757131 2570 scope.go:117] "RemoveContainer" containerID="c59763a6942acadafad83cc9aa6f2a7be7e57e8063d692ac1141b88200208b50" Apr 16 20:42:35.767409 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.767350 2570 scope.go:117] "RemoveContainer" containerID="4c3ca9999296b8d1ccbcbc70ab79ec5d1365e67776a282eb66b85531de6c6945" Apr 16 20:42:35.779555 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.779527 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"] Apr 16 20:42:35.786893 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.786864 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv"] Apr 16 20:42:35.787329 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.787276 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="kserve-container" containerID="cri-o://4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e" gracePeriod=30 Apr 16 20:42:35.788798 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.788774 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4pcq4"] Apr 16 20:42:35.832375 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.832332 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g"] Apr 16 20:42:35.832717 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.832704 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container" Apr 16 20:42:35.832717 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:42:35.832718 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container" Apr 16 20:42:35.832845 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.832744 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="storage-initializer" Apr 16 20:42:35.832845 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.832750 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="storage-initializer" Apr 16 20:42:35.832845 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.832809 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" containerName="kserve-container" Apr 16 20:42:35.837262 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.837240 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:42:35.844996 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.844973 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g"] Apr 16 20:42:35.922529 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:35.922483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40f0e09f-2a4f-4af8-83df-412636cfcd3b-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-nd99g\" (UID: \"40f0e09f-2a4f-4af8-83df-412636cfcd3b\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:42:36.024025 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:36.023928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/40f0e09f-2a4f-4af8-83df-412636cfcd3b-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-nd99g\" (UID: \"40f0e09f-2a4f-4af8-83df-412636cfcd3b\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:42:36.024327 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:36.024308 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40f0e09f-2a4f-4af8-83df-412636cfcd3b-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-nd99g\" (UID: \"40f0e09f-2a4f-4af8-83df-412636cfcd3b\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:42:36.149001 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:36.148961 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:42:36.277422 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:36.277357 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g"] Apr 16 20:42:36.281361 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:42:36.281323 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f0e09f_2a4f_4af8_83df_412636cfcd3b.slice/crio-8b012478e2f432d02f057797ab7f6ebc30740a8a8f907dbf7d1bd4d08f25b539 WatchSource:0}: Error finding container 8b012478e2f432d02f057797ab7f6ebc30740a8a8f907dbf7d1bd4d08f25b539: Status 404 returned error can't find the container with id 8b012478e2f432d02f057797ab7f6ebc30740a8a8f907dbf7d1bd4d08f25b539 Apr 16 20:42:36.764763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:36.764724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" 
event={"ID":"40f0e09f-2a4f-4af8-83df-412636cfcd3b","Type":"ContainerStarted","Data":"8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd"} Apr 16 20:42:36.764763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:36.764766 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" event={"ID":"40f0e09f-2a4f-4af8-83df-412636cfcd3b","Type":"ContainerStarted","Data":"8b012478e2f432d02f057797ab7f6ebc30740a8a8f907dbf7d1bd4d08f25b539"} Apr 16 20:42:37.111894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:37.111812 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5865f82-5841-41ae-8c5e-53bc9c659b8a" path="/var/lib/kubelet/pods/b5865f82-5841-41ae-8c5e-53bc9c659b8a/volumes" Apr 16 20:42:40.780277 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:40.780240 2570 generic.go:358] "Generic (PLEG): container finished" podID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerID="8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd" exitCode=0 Apr 16 20:42:40.780955 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:42:40.780298 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" event={"ID":"40f0e09f-2a4f-4af8-83df-412636cfcd3b","Type":"ContainerDied","Data":"8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd"} Apr 16 20:43:06.495704 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.495674 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:43:06.538633 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.538017 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d0bb27-c523-4224-ae72-c4219f3d0b32-kserve-provision-location\") pod \"87d0bb27-c523-4224-ae72-c4219f3d0b32\" (UID: \"87d0bb27-c523-4224-ae72-c4219f3d0b32\") " Apr 16 20:43:06.548935 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.548854 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87d0bb27-c523-4224-ae72-c4219f3d0b32-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87d0bb27-c523-4224-ae72-c4219f3d0b32" (UID: "87d0bb27-c523-4224-ae72-c4219f3d0b32"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:43:06.639764 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.639669 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d0bb27-c523-4224-ae72-c4219f3d0b32-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:43:06.927504 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.927417 2570 generic.go:358] "Generic (PLEG): container finished" podID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerID="4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e" exitCode=137 Apr 16 20:43:06.927703 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.927506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" event={"ID":"87d0bb27-c523-4224-ae72-c4219f3d0b32","Type":"ContainerDied","Data":"4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e"} Apr 16 20:43:06.927703 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:43:06.927536 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" event={"ID":"87d0bb27-c523-4224-ae72-c4219f3d0b32","Type":"ContainerDied","Data":"9e446ec99cb6c78fc9b73fffc0bbcb9e344105a9a6e9cbe7971b54150b009ae9"} Apr 16 20:43:06.927703 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.927555 2570 scope.go:117] "RemoveContainer" containerID="4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e" Apr 16 20:43:06.927703 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.927588 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv" Apr 16 20:43:06.939373 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.938954 2570 scope.go:117] "RemoveContainer" containerID="8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329" Apr 16 20:43:06.948945 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.948879 2570 scope.go:117] "RemoveContainer" containerID="4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e" Apr 16 20:43:06.949316 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:43:06.949234 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e\": container with ID starting with 4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e not found: ID does not exist" containerID="4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e" Apr 16 20:43:06.949316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.949270 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e"} err="failed to get container status \"4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e\": rpc 
error: code = NotFound desc = could not find container \"4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e\": container with ID starting with 4fc9ab33c26da40ead5888812f7d03163c9b55aa4bd5095bbf2845d3d466a14e not found: ID does not exist" Apr 16 20:43:06.949316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.949291 2570 scope.go:117] "RemoveContainer" containerID="8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329" Apr 16 20:43:06.949620 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:43:06.949595 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329\": container with ID starting with 8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329 not found: ID does not exist" containerID="8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329" Apr 16 20:43:06.949677 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.949631 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329"} err="failed to get container status \"8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329\": rpc error: code = NotFound desc = could not find container \"8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329\": container with ID starting with 8ed3b6966f170e2e4a2a586c247a59de27ba441c22fe422c8d64e6798028a329 not found: ID does not exist" Apr 16 20:43:06.954743 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.954704 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv"] Apr 16 20:43:06.957313 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:06.957268 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-n2tdv"] Apr 16 20:43:07.114271 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:43:07.114227 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" path="/var/lib/kubelet/pods/87d0bb27-c523-4224-ae72-c4219f3d0b32/volumes" Apr 16 20:44:35.267658 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:35.267612 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" event={"ID":"40f0e09f-2a4f-4af8-83df-412636cfcd3b","Type":"ContainerStarted","Data":"714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03"} Apr 16 20:44:35.268108 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:35.267770 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:44:35.269123 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:35.269096 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 20:44:35.286261 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:35.286214 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" podStartSLOduration=6.143504017 podStartE2EDuration="2m0.286200271s" podCreationTimestamp="2026-04-16 20:42:35 +0000 UTC" firstStartedPulling="2026-04-16 20:42:40.781426061 +0000 UTC m=+2924.316292075" lastFinishedPulling="2026-04-16 20:44:34.924122315 +0000 UTC m=+3038.458988329" observedRunningTime="2026-04-16 20:44:35.285128186 +0000 UTC m=+3038.819994219" watchObservedRunningTime="2026-04-16 20:44:35.286200271 +0000 UTC m=+3038.821066306" Apr 16 20:44:36.271307 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:36.271266 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 20:44:46.272723 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:46.272632 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:44:57.485210 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.485172 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g"] Apr 16 20:44:57.486480 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.486005 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="kserve-container" containerID="cri-o://714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03" gracePeriod=30 Apr 16 20:44:57.577550 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.577515 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl"] Apr 16 20:44:57.577903 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.577888 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="storage-initializer" Apr 16 20:44:57.577966 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.577905 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="storage-initializer" Apr 16 20:44:57.577966 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.577926 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="kserve-container" Apr 16 20:44:57.577966 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:44:57.577931 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="kserve-container" Apr 16 20:44:57.578116 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.577990 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="87d0bb27-c523-4224-ae72-c4219f3d0b32" containerName="kserve-container" Apr 16 20:44:57.582412 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.582393 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:44:57.590530 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.590501 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl"] Apr 16 20:44:57.704370 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.704328 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e49fa01-fe06-4910-aa3d-7cc937266be8-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-ddxwl\" (UID: \"9e49fa01-fe06-4910-aa3d-7cc937266be8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:44:57.805831 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.805792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e49fa01-fe06-4910-aa3d-7cc937266be8-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-ddxwl\" (UID: \"9e49fa01-fe06-4910-aa3d-7cc937266be8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:44:57.806244 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.806224 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9e49fa01-fe06-4910-aa3d-7cc937266be8-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-ddxwl\" (UID: \"9e49fa01-fe06-4910-aa3d-7cc937266be8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:44:57.893088 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:57.893016 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:44:58.088944 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:58.088888 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl"] Apr 16 20:44:58.091210 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:44:58.091166 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e49fa01_fe06_4910_aa3d_7cc937266be8.slice/crio-9516e5ba246330fa70b432f3f013e7855427b6b0ab56bd34b2f4b6c1d7306732 WatchSource:0}: Error finding container 9516e5ba246330fa70b432f3f013e7855427b6b0ab56bd34b2f4b6c1d7306732: Status 404 returned error can't find the container with id 9516e5ba246330fa70b432f3f013e7855427b6b0ab56bd34b2f4b6c1d7306732 Apr 16 20:44:58.349010 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:58.348917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" event={"ID":"9e49fa01-fe06-4910-aa3d-7cc937266be8","Type":"ContainerStarted","Data":"2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2"} Apr 16 20:44:58.349010 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:44:58.348959 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" event={"ID":"9e49fa01-fe06-4910-aa3d-7cc937266be8","Type":"ContainerStarted","Data":"9516e5ba246330fa70b432f3f013e7855427b6b0ab56bd34b2f4b6c1d7306732"} Apr 16 20:45:00.131625 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:45:00.131597 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:45:00.328977 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.328936 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40f0e09f-2a4f-4af8-83df-412636cfcd3b-kserve-provision-location\") pod \"40f0e09f-2a4f-4af8-83df-412636cfcd3b\" (UID: \"40f0e09f-2a4f-4af8-83df-412636cfcd3b\") " Apr 16 20:45:00.329385 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.329348 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f0e09f-2a4f-4af8-83df-412636cfcd3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40f0e09f-2a4f-4af8-83df-412636cfcd3b" (UID: "40f0e09f-2a4f-4af8-83df-412636cfcd3b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:45:00.356351 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.356322 2570 generic.go:358] "Generic (PLEG): container finished" podID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerID="714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03" exitCode=0 Apr 16 20:45:00.356477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.356396 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" Apr 16 20:45:00.356477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.356402 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" event={"ID":"40f0e09f-2a4f-4af8-83df-412636cfcd3b","Type":"ContainerDied","Data":"714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03"} Apr 16 20:45:00.356477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.356430 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g" event={"ID":"40f0e09f-2a4f-4af8-83df-412636cfcd3b","Type":"ContainerDied","Data":"8b012478e2f432d02f057797ab7f6ebc30740a8a8f907dbf7d1bd4d08f25b539"} Apr 16 20:45:00.356477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.356445 2570 scope.go:117] "RemoveContainer" containerID="714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03" Apr 16 20:45:00.365558 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.365538 2570 scope.go:117] "RemoveContainer" containerID="8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd" Apr 16 20:45:00.373001 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.372981 2570 scope.go:117] "RemoveContainer" containerID="714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03" Apr 16 20:45:00.373246 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:45:00.373230 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03\": container with ID starting with 714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03 not found: ID does not exist" containerID="714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03" Apr 16 20:45:00.373305 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.373254 2570 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03"} err="failed to get container status \"714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03\": rpc error: code = NotFound desc = could not find container \"714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03\": container with ID starting with 714eeb8dadda4ee8694a1ad4657a0385db82b913965d49d64dcc2600684a1f03 not found: ID does not exist" Apr 16 20:45:00.373305 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.373270 2570 scope.go:117] "RemoveContainer" containerID="8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd" Apr 16 20:45:00.373508 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:45:00.373490 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd\": container with ID starting with 8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd not found: ID does not exist" containerID="8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd" Apr 16 20:45:00.373560 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.373512 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd"} err="failed to get container status \"8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd\": rpc error: code = NotFound desc = could not find container \"8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd\": container with ID starting with 8be09b1734c3e06365cb5986fd894e4d8a511b03039b2973e17b303bc711c2cd not found: ID does not exist" Apr 16 20:45:00.385016 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.384984 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g"] Apr 16 20:45:00.393783 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.393751 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-nd99g"] Apr 16 20:45:00.430183 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:00.430142 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40f0e09f-2a4f-4af8-83df-412636cfcd3b-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:45:01.112115 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:01.112080 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" path="/var/lib/kubelet/pods/40f0e09f-2a4f-4af8-83df-412636cfcd3b/volumes" Apr 16 20:45:02.364423 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:02.364385 2570 generic.go:358] "Generic (PLEG): container finished" podID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerID="2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2" exitCode=0 Apr 16 20:45:02.364855 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:02.364435 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" event={"ID":"9e49fa01-fe06-4910-aa3d-7cc937266be8","Type":"ContainerDied","Data":"2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2"} Apr 16 20:45:22.445795 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:22.445746 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" event={"ID":"9e49fa01-fe06-4910-aa3d-7cc937266be8","Type":"ContainerStarted","Data":"b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c"} Apr 16 20:45:22.446364 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:22.446315 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 
20:45:22.447661 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:22.447632 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:45:22.462926 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:22.462881 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podStartSLOduration=6.245325944 podStartE2EDuration="25.462867636s" podCreationTimestamp="2026-04-16 20:44:57 +0000 UTC" firstStartedPulling="2026-04-16 20:45:02.365723504 +0000 UTC m=+3065.900589518" lastFinishedPulling="2026-04-16 20:45:21.583265181 +0000 UTC m=+3085.118131210" observedRunningTime="2026-04-16 20:45:22.461887638 +0000 UTC m=+3085.996753673" watchObservedRunningTime="2026-04-16 20:45:22.462867636 +0000 UTC m=+3085.997733672" Apr 16 20:45:23.449631 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:23.449587 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:45:33.450274 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:33.450228 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:45:43.450495 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:43.450451 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" 
podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:45:53.450212 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:53.450163 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:45:59.557309 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:59.557268 2570 scope.go:117] "RemoveContainer" containerID="079731b26872ade3501e0550bca2dce0766ecd06b96c43a0ad244ba5897480a2" Apr 16 20:45:59.565540 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:45:59.565504 2570 scope.go:117] "RemoveContainer" containerID="c6c3d288d713a26746392672ca0c4749edbbc5c1401cc8c144391202cb0a529b" Apr 16 20:46:03.450601 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:03.450554 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:46:13.449789 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:13.449737 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 16 20:46:23.451305 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:23.451227 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:46:27.716698 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.716658 2570 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl"] Apr 16 20:46:27.717177 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.717143 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" containerID="cri-o://b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c" gracePeriod=30 Apr 16 20:46:27.813459 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.813416 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"] Apr 16 20:46:27.813893 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.813875 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="storage-initializer" Apr 16 20:46:27.813958 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.813895 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="storage-initializer" Apr 16 20:46:27.813958 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.813904 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="kserve-container" Apr 16 20:46:27.813958 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.813910 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="kserve-container" Apr 16 20:46:27.814080 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.813981 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f0e09f-2a4f-4af8-83df-412636cfcd3b" containerName="kserve-container" Apr 16 20:46:27.817179 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.817157 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:46:27.825406 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.825383 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"] Apr 16 20:46:27.907527 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:27.907492 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2390c8b1-2bd3-47b9-8e40-6372f56f3017-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf\" (UID: \"2390c8b1-2bd3-47b9-8e40-6372f56f3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:46:28.008912 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.008876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2390c8b1-2bd3-47b9-8e40-6372f56f3017-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf\" (UID: \"2390c8b1-2bd3-47b9-8e40-6372f56f3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:46:28.009324 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.009301 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2390c8b1-2bd3-47b9-8e40-6372f56f3017-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf\" (UID: \"2390c8b1-2bd3-47b9-8e40-6372f56f3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:46:28.128414 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.128380 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:46:28.255349 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.255322 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"] Apr 16 20:46:28.257787 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:46:28.257749 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2390c8b1_2bd3_47b9_8e40_6372f56f3017.slice/crio-d744e3b9be2b2f7fedb404852b60faaaaf8d7ddd0019ad7d25ce5a431d3d1d1d WatchSource:0}: Error finding container d744e3b9be2b2f7fedb404852b60faaaaf8d7ddd0019ad7d25ce5a431d3d1d1d: Status 404 returned error can't find the container with id d744e3b9be2b2f7fedb404852b60faaaaf8d7ddd0019ad7d25ce5a431d3d1d1d Apr 16 20:46:28.259672 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.259656 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:46:28.674534 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.674446 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" event={"ID":"2390c8b1-2bd3-47b9-8e40-6372f56f3017","Type":"ContainerStarted","Data":"bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658"} Apr 16 20:46:28.674534 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:28.674490 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" event={"ID":"2390c8b1-2bd3-47b9-8e40-6372f56f3017","Type":"ContainerStarted","Data":"d744e3b9be2b2f7fedb404852b60faaaaf8d7ddd0019ad7d25ce5a431d3d1d1d"} Apr 16 20:46:31.455757 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.455732 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:46:31.643397 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.643353 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e49fa01-fe06-4910-aa3d-7cc937266be8-kserve-provision-location\") pod \"9e49fa01-fe06-4910-aa3d-7cc937266be8\" (UID: \"9e49fa01-fe06-4910-aa3d-7cc937266be8\") " Apr 16 20:46:31.643703 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.643676 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e49fa01-fe06-4910-aa3d-7cc937266be8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9e49fa01-fe06-4910-aa3d-7cc937266be8" (UID: "9e49fa01-fe06-4910-aa3d-7cc937266be8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:46:31.687246 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.687214 2570 generic.go:358] "Generic (PLEG): container finished" podID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerID="b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c" exitCode=0 Apr 16 20:46:31.687418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.687271 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" event={"ID":"9e49fa01-fe06-4910-aa3d-7cc937266be8","Type":"ContainerDied","Data":"b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c"} Apr 16 20:46:31.687418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.687294 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" event={"ID":"9e49fa01-fe06-4910-aa3d-7cc937266be8","Type":"ContainerDied","Data":"9516e5ba246330fa70b432f3f013e7855427b6b0ab56bd34b2f4b6c1d7306732"} Apr 16 20:46:31.687418 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:46:31.687300 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl" Apr 16 20:46:31.687418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.687310 2570 scope.go:117] "RemoveContainer" containerID="b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c" Apr 16 20:46:31.696440 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.696419 2570 scope.go:117] "RemoveContainer" containerID="2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2" Apr 16 20:46:31.703691 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.703657 2570 scope.go:117] "RemoveContainer" containerID="b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c" Apr 16 20:46:31.703954 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:46:31.703934 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c\": container with ID starting with b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c not found: ID does not exist" containerID="b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c" Apr 16 20:46:31.704042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.703967 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c"} err="failed to get container status \"b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c\": rpc error: code = NotFound desc = could not find container \"b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c\": container with ID starting with b5f4bb31c0ed8aa3fb78d2448641a37239ada31d60a3e22d13844bf33c08209c not found: ID does not exist" Apr 16 20:46:31.704042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.703999 2570 scope.go:117] "RemoveContainer" 
containerID="2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2" Apr 16 20:46:31.704330 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:46:31.704312 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2\": container with ID starting with 2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2 not found: ID does not exist" containerID="2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2" Apr 16 20:46:31.704379 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.704335 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2"} err="failed to get container status \"2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2\": rpc error: code = NotFound desc = could not find container \"2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2\": container with ID starting with 2bb7c2f3a046386c7b9fc3bd03482fd472bcbceaff7749cc70b75bd4d4b8aaf2 not found: ID does not exist" Apr 16 20:46:31.710122 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.710032 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl"] Apr 16 20:46:31.714678 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.714659 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-ddxwl"] Apr 16 20:46:31.744453 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:31.744417 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e49fa01-fe06-4910-aa3d-7cc937266be8-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:46:32.693577 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:32.693542 2570 
generic.go:358] "Generic (PLEG): container finished" podID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerID="bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658" exitCode=0 Apr 16 20:46:32.693994 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:32.693604 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" event={"ID":"2390c8b1-2bd3-47b9-8e40-6372f56f3017","Type":"ContainerDied","Data":"bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658"} Apr 16 20:46:33.112128 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:33.112094 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" path="/var/lib/kubelet/pods/9e49fa01-fe06-4910-aa3d-7cc937266be8/volumes" Apr 16 20:46:33.700024 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:33.699982 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" event={"ID":"2390c8b1-2bd3-47b9-8e40-6372f56f3017","Type":"ContainerStarted","Data":"80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094"} Apr 16 20:46:33.700541 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:33.700231 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:46:33.717682 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:46:33.717632 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" podStartSLOduration=6.71761807 podStartE2EDuration="6.71761807s" podCreationTimestamp="2026-04-16 20:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:46:33.716187915 +0000 UTC m=+3157.251053950" watchObservedRunningTime="2026-04-16 
20:46:33.71761807 +0000 UTC m=+3157.252484106" Apr 16 20:47:04.711939 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:04.711895 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.69:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 20:47:14.713838 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:14.713806 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" Apr 16 20:47:17.903693 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.903658 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"] Apr 16 20:47:17.904094 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.903974 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="kserve-container" containerID="cri-o://80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094" gracePeriod=30 Apr 16 20:47:17.947902 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.947858 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"] Apr 16 20:47:17.948348 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.948329 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="storage-initializer" Apr 16 20:47:17.948455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.948350 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" 
containerName="storage-initializer" Apr 16 20:47:17.948455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.948378 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" Apr 16 20:47:17.948455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.948387 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" Apr 16 20:47:17.948617 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.948472 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e49fa01-fe06-4910-aa3d-7cc937266be8" containerName="kserve-container" Apr 16 20:47:17.951652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.951631 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" Apr 16 20:47:17.959171 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:17.959146 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"] Apr 16 20:47:18.037928 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.037883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a69ae55-7216-4767-a8e2-c3fcbc89897f-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-n9r2l\" (UID: \"5a69ae55-7216-4767-a8e2-c3fcbc89897f\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" Apr 16 20:47:18.138821 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.138767 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a69ae55-7216-4767-a8e2-c3fcbc89897f-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-n9r2l\" (UID: 
\"5a69ae55-7216-4767-a8e2-c3fcbc89897f\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:47:18.139305 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.139278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a69ae55-7216-4767-a8e2-c3fcbc89897f-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-n9r2l\" (UID: \"5a69ae55-7216-4767-a8e2-c3fcbc89897f\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:47:18.262816 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.262791 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:47:18.429771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.429742 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"]
Apr 16 20:47:18.432326 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:47:18.432285 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a69ae55_7216_4767_a8e2_c3fcbc89897f.slice/crio-85a8faf8b9ced52f89a3cb4600e909be9317797afbfb966279c6fd475d504fd1 WatchSource:0}: Error finding container 85a8faf8b9ced52f89a3cb4600e909be9317797afbfb966279c6fd475d504fd1: Status 404 returned error can't find the container with id 85a8faf8b9ced52f89a3cb4600e909be9317797afbfb966279c6fd475d504fd1
Apr 16 20:47:18.858925 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.858886 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" event={"ID":"5a69ae55-7216-4767-a8e2-c3fcbc89897f","Type":"ContainerStarted","Data":"a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e"}
Apr 16 20:47:18.858925 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:18.858927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" event={"ID":"5a69ae55-7216-4767-a8e2-c3fcbc89897f","Type":"ContainerStarted","Data":"85a8faf8b9ced52f89a3cb4600e909be9317797afbfb966279c6fd475d504fd1"}
Apr 16 20:47:22.871962 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:22.871928 2570 generic.go:358] "Generic (PLEG): container finished" podID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerID="a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e" exitCode=0
Apr 16 20:47:22.872405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:22.872003 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" event={"ID":"5a69ae55-7216-4767-a8e2-c3fcbc89897f","Type":"ContainerDied","Data":"a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e"}
Apr 16 20:47:23.877014 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:23.876980 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" event={"ID":"5a69ae55-7216-4767-a8e2-c3fcbc89897f","Type":"ContainerStarted","Data":"8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c"}
Apr 16 20:47:23.877451 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:23.877225 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:47:23.894692 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:23.894597 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" podStartSLOduration=6.894575332 podStartE2EDuration="6.894575332s" podCreationTimestamp="2026-04-16 20:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:47:23.893223081 +0000 UTC m=+3207.428089117" watchObservedRunningTime="2026-04-16 20:47:23.894575332 +0000 UTC m=+3207.429441370"
Apr 16 20:47:24.650127 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.650104 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"
Apr 16 20:47:24.796677 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.796638 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2390c8b1-2bd3-47b9-8e40-6372f56f3017-kserve-provision-location\") pod \"2390c8b1-2bd3-47b9-8e40-6372f56f3017\" (UID: \"2390c8b1-2bd3-47b9-8e40-6372f56f3017\") "
Apr 16 20:47:24.796969 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.796944 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2390c8b1-2bd3-47b9-8e40-6372f56f3017-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2390c8b1-2bd3-47b9-8e40-6372f56f3017" (UID: "2390c8b1-2bd3-47b9-8e40-6372f56f3017"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:47:24.884786 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.884747 2570 generic.go:358] "Generic (PLEG): container finished" podID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerID="80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094" exitCode=0
Apr 16 20:47:24.885272 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.884821 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"
Apr 16 20:47:24.885272 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.884831 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" event={"ID":"2390c8b1-2bd3-47b9-8e40-6372f56f3017","Type":"ContainerDied","Data":"80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094"}
Apr 16 20:47:24.885272 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.884874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf" event={"ID":"2390c8b1-2bd3-47b9-8e40-6372f56f3017","Type":"ContainerDied","Data":"d744e3b9be2b2f7fedb404852b60faaaaf8d7ddd0019ad7d25ce5a431d3d1d1d"}
Apr 16 20:47:24.885272 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.884894 2570 scope.go:117] "RemoveContainer" containerID="80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094"
Apr 16 20:47:24.894196 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.894175 2570 scope.go:117] "RemoveContainer" containerID="bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658"
Apr 16 20:47:24.897796 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.897770 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2390c8b1-2bd3-47b9-8e40-6372f56f3017-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:47:24.901707 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.901689 2570 scope.go:117] "RemoveContainer" containerID="80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094"
Apr 16 20:47:24.901953 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:47:24.901932 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094\": container with ID starting with 80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094 not found: ID does not exist" containerID="80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094"
Apr 16 20:47:24.902017 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.901962 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094"} err="failed to get container status \"80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094\": rpc error: code = NotFound desc = could not find container \"80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094\": container with ID starting with 80faf8a91de484873b19b88706f16dcaa22e9b00ce1aac8d59459fdec8b2e094 not found: ID does not exist"
Apr 16 20:47:24.902017 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.901982 2570 scope.go:117] "RemoveContainer" containerID="bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658"
Apr 16 20:47:24.902240 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:47:24.902222 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658\": container with ID starting with bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658 not found: ID does not exist" containerID="bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658"
Apr 16 20:47:24.902281 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.902247 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658"} err="failed to get container status \"bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658\": rpc error: code = NotFound desc = could not find container \"bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658\": container with ID starting with bc72fcd7afa26d9fc7871b6dca1965d80f9b6ef0cade72268c0a482cf8224658 not found: ID does not exist"
Apr 16 20:47:24.906652 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.906628 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"]
Apr 16 20:47:24.909978 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:24.909949 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-r4drf"]
Apr 16 20:47:25.111717 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:25.111638 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" path="/var/lib/kubelet/pods/2390c8b1-2bd3-47b9-8e40-6372f56f3017/volumes"
Apr 16 20:47:54.932397 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:54.932303 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:47:58.174193 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.174157 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"]
Apr 16 20:47:58.174677 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.174481 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="kserve-container" containerID="cri-o://8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c" gracePeriod=30
Apr 16 20:47:58.228639 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.228604 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"]
Apr 16 20:47:58.229032 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.229017 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="kserve-container"
Apr 16 20:47:58.229032 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.229033 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="kserve-container"
Apr 16 20:47:58.229181 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.229043 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="storage-initializer"
Apr 16 20:47:58.229181 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.229049 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="storage-initializer"
Apr 16 20:47:58.229181 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.229128 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2390c8b1-2bd3-47b9-8e40-6372f56f3017" containerName="kserve-container"
Apr 16 20:47:58.232418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.232401 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:47:58.245946 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.245912 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"]
Apr 16 20:47:58.282372 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.282329 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4818693-7888-48d7-be9d-d94ba214c0a2-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-jpj67\" (UID: \"f4818693-7888-48d7-be9d-d94ba214c0a2\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:47:58.383855 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.383805 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4818693-7888-48d7-be9d-d94ba214c0a2-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-jpj67\" (UID: \"f4818693-7888-48d7-be9d-d94ba214c0a2\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:47:58.384372 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.384342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4818693-7888-48d7-be9d-d94ba214c0a2-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-jpj67\" (UID: \"f4818693-7888-48d7-be9d-d94ba214c0a2\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:47:58.542915 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.542884 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:47:58.667793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:58.667764 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"]
Apr 16 20:47:58.670113 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:47:58.670081 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4818693_7888_48d7_be9d_d94ba214c0a2.slice/crio-e25ee2c55d3f35618a3011b9a243b2c7478d4973eaaba919887f9fa42a28324b WatchSource:0}: Error finding container e25ee2c55d3f35618a3011b9a243b2c7478d4973eaaba919887f9fa42a28324b: Status 404 returned error can't find the container with id e25ee2c55d3f35618a3011b9a243b2c7478d4973eaaba919887f9fa42a28324b
Apr 16 20:47:59.007832 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:59.007787 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" event={"ID":"f4818693-7888-48d7-be9d-d94ba214c0a2","Type":"ContainerStarted","Data":"7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47"}
Apr 16 20:47:59.007832 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:47:59.007836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" event={"ID":"f4818693-7888-48d7-be9d-d94ba214c0a2","Type":"ContainerStarted","Data":"e25ee2c55d3f35618a3011b9a243b2c7478d4973eaaba919887f9fa42a28324b"}
Apr 16 20:48:03.022150 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:03.022114 2570 generic.go:358] "Generic (PLEG): container finished" podID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerID="7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47" exitCode=0
Apr 16 20:48:03.022696 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:03.022193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" event={"ID":"f4818693-7888-48d7-be9d-d94ba214c0a2","Type":"ContainerDied","Data":"7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47"}
Apr 16 20:48:04.027330 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:04.027290 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" event={"ID":"f4818693-7888-48d7-be9d-d94ba214c0a2","Type":"ContainerStarted","Data":"011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4"}
Apr 16 20:48:04.027734 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:04.027613 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:48:04.028922 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:04.028891 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:48:04.045711 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:04.045658 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podStartSLOduration=6.045643207 podStartE2EDuration="6.045643207s" podCreationTimestamp="2026-04-16 20:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:48:04.043425484 +0000 UTC m=+3247.578291520" watchObservedRunningTime="2026-04-16 20:48:04.045643207 +0000 UTC m=+3247.580509243"
Apr 16 20:48:04.886146 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:04.886100 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.70:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.70:8080: connect: connection refused"
Apr 16 20:48:05.031623 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:05.031578 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:48:06.253148 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:06.253124 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:48:06.356847 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:06.356748 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a69ae55-7216-4767-a8e2-c3fcbc89897f-kserve-provision-location\") pod \"5a69ae55-7216-4767-a8e2-c3fcbc89897f\" (UID: \"5a69ae55-7216-4767-a8e2-c3fcbc89897f\") "
Apr 16 20:48:06.357180 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:06.357149 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a69ae55-7216-4767-a8e2-c3fcbc89897f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5a69ae55-7216-4767-a8e2-c3fcbc89897f" (UID: "5a69ae55-7216-4767-a8e2-c3fcbc89897f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:48:06.457405 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:06.457363 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a69ae55-7216-4767-a8e2-c3fcbc89897f-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:48:07.040620 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.040581 2570 generic.go:358] "Generic (PLEG): container finished" podID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerID="8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c" exitCode=0
Apr 16 20:48:07.040793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.040666 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"
Apr 16 20:48:07.040793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.040674 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" event={"ID":"5a69ae55-7216-4767-a8e2-c3fcbc89897f","Type":"ContainerDied","Data":"8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c"}
Apr 16 20:48:07.040793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.040730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l" event={"ID":"5a69ae55-7216-4767-a8e2-c3fcbc89897f","Type":"ContainerDied","Data":"85a8faf8b9ced52f89a3cb4600e909be9317797afbfb966279c6fd475d504fd1"}
Apr 16 20:48:07.040793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.040753 2570 scope.go:117] "RemoveContainer" containerID="8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c"
Apr 16 20:48:07.050478 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.050457 2570 scope.go:117] "RemoveContainer" containerID="a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e"
Apr 16 20:48:07.057685 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.057666 2570 scope.go:117] "RemoveContainer" containerID="8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c"
Apr 16 20:48:07.057943 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:48:07.057925 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c\": container with ID starting with 8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c not found: ID does not exist" containerID="8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c"
Apr 16 20:48:07.057984 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.057955 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c"} err="failed to get container status \"8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c\": rpc error: code = NotFound desc = could not find container \"8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c\": container with ID starting with 8e842ee06ed87ef86059cbc9360b58ac4355fd190b994d99930406b5b82b599c not found: ID does not exist"
Apr 16 20:48:07.057984 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.057975 2570 scope.go:117] "RemoveContainer" containerID="a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e"
Apr 16 20:48:07.058220 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:48:07.058205 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e\": container with ID starting with a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e not found: ID does not exist" containerID="a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e"
Apr 16 20:48:07.058268 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.058224 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e"} err="failed to get container status \"a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e\": rpc error: code = NotFound desc = could not find container \"a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e\": container with ID starting with a8423e9a1ecbb0e8d19b01a33482ed2bce0c6f4afc3e58fca2f0f33cbe4efe1e not found: ID does not exist"
Apr 16 20:48:07.063360 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.063338 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"]
Apr 16 20:48:07.067152 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.067127 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-n9r2l"]
Apr 16 20:48:07.111504 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:07.111476 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" path="/var/lib/kubelet/pods/5a69ae55-7216-4767-a8e2-c3fcbc89897f/volumes"
Apr 16 20:48:15.032622 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:15.032579 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:48:25.032220 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:25.032166 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:48:35.032023 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:35.031965 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:48:45.032570 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:45.032513 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:48:55.032068 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:48:55.032008 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused"
Apr 16 20:49:05.033463 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:05.033422 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:49:08.366198 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.366166 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"]
Apr 16 20:49:08.366587 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.366455 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" containerID="cri-o://011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4" gracePeriod=30
Apr 16 20:49:08.444002 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.443968 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"]
Apr 16 20:49:08.444437 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.444419 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="kserve-container"
Apr 16 20:49:08.444533 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.444439 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="kserve-container"
Apr 16 20:49:08.444533 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.444463 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="storage-initializer"
Apr 16 20:49:08.444533 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.444471 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="storage-initializer"
Apr 16 20:49:08.444700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.444557 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a69ae55-7216-4767-a8e2-c3fcbc89897f" containerName="kserve-container"
Apr 16 20:49:08.447693 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.447671 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"
Apr 16 20:49:08.457249 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.457223 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"]
Apr 16 20:49:08.506385 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.506342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af4bcc1c-761f-41a4-9e6f-8bebef27960c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw\" (UID: \"af4bcc1c-761f-41a4-9e6f-8bebef27960c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"
Apr 16 20:49:08.607594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.607557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af4bcc1c-761f-41a4-9e6f-8bebef27960c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw\" (UID: \"af4bcc1c-761f-41a4-9e6f-8bebef27960c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"
Apr 16 20:49:08.607922 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.607901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af4bcc1c-761f-41a4-9e6f-8bebef27960c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw\" (UID: \"af4bcc1c-761f-41a4-9e6f-8bebef27960c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"
Apr 16 20:49:08.758724 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.758688 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"
Apr 16 20:49:08.884433 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:08.884405 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"]
Apr 16 20:49:08.886543 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:49:08.886517 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf4bcc1c_761f_41a4_9e6f_8bebef27960c.slice/crio-7ef7538a259ff54e1d8eb2efbfc282354626e01e2655d2d55ad9e137ffcebcb3 WatchSource:0}: Error finding container 7ef7538a259ff54e1d8eb2efbfc282354626e01e2655d2d55ad9e137ffcebcb3: Status 404 returned error can't find the container with id 7ef7538a259ff54e1d8eb2efbfc282354626e01e2655d2d55ad9e137ffcebcb3
Apr 16 20:49:09.266765 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:09.266730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" event={"ID":"af4bcc1c-761f-41a4-9e6f-8bebef27960c","Type":"ContainerStarted","Data":"61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce"}
Apr 16 20:49:09.266959 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:09.266773 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" event={"ID":"af4bcc1c-761f-41a4-9e6f-8bebef27960c","Type":"ContainerStarted","Data":"7ef7538a259ff54e1d8eb2efbfc282354626e01e2655d2d55ad9e137ffcebcb3"}
Apr 16 20:49:12.113105 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.113079 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:49:12.141522 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.141490 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4818693-7888-48d7-be9d-d94ba214c0a2-kserve-provision-location\") pod \"f4818693-7888-48d7-be9d-d94ba214c0a2\" (UID: \"f4818693-7888-48d7-be9d-d94ba214c0a2\") "
Apr 16 20:49:12.141804 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.141781 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4818693-7888-48d7-be9d-d94ba214c0a2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f4818693-7888-48d7-be9d-d94ba214c0a2" (UID: "f4818693-7888-48d7-be9d-d94ba214c0a2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:49:12.242979 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.242892 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4818693-7888-48d7-be9d-d94ba214c0a2-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:49:12.278140 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.278109 2570 generic.go:358] "Generic (PLEG): container finished" podID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerID="011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4" exitCode=0
Apr 16 20:49:12.278282 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.278174 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"
Apr 16 20:49:12.278282 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.278193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" event={"ID":"f4818693-7888-48d7-be9d-d94ba214c0a2","Type":"ContainerDied","Data":"011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4"}
Apr 16 20:49:12.278282 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.278232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67" event={"ID":"f4818693-7888-48d7-be9d-d94ba214c0a2","Type":"ContainerDied","Data":"e25ee2c55d3f35618a3011b9a243b2c7478d4973eaaba919887f9fa42a28324b"}
Apr 16 20:49:12.278282 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.278247 2570 scope.go:117] "RemoveContainer" containerID="011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4"
Apr 16 20:49:12.287270 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.287253 2570 scope.go:117] "RemoveContainer" containerID="7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47"
Apr 16 20:49:12.294710 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.294692 2570 scope.go:117] "RemoveContainer" containerID="011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4"
Apr 16 20:49:12.294975 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:49:12.294955 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4\": container with ID starting with 011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4 not found: ID does not exist" containerID="011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4"
Apr 16 20:49:12.295023 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.294987 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4"} err="failed to get container status \"011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4\": rpc error: code = NotFound desc = could not find container \"011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4\": container with ID starting with 011a6d8cfebca726f1988c8affe199ceb72c643939bd502af6f9c600c87fa7b4 not found: ID does not exist"
Apr 16 20:49:12.295023 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.295007 2570 scope.go:117] "RemoveContainer" containerID="7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47"
Apr 16 20:49:12.295268 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:49:12.295248 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47\": container with ID starting with 7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47 not found: ID does not exist" containerID="7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47"
Apr 16 20:49:12.295319 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.295273 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47"} err="failed to get container status \"7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47\": rpc error: code = NotFound desc = could not find container \"7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47\": container with ID starting with 7f71bde13848c24007b329514b66ab5063701c6bc91f334053eb36c175237f47 not found: ID does not exist"
Apr 16 20:49:12.300567 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.300543 2570 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"] Apr 16 20:49:12.304567 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:12.304544 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-jpj67"] Apr 16 20:49:13.112642 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:13.112609 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" path="/var/lib/kubelet/pods/f4818693-7888-48d7-be9d-d94ba214c0a2/volumes" Apr 16 20:49:13.283300 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:13.283267 2570 generic.go:358] "Generic (PLEG): container finished" podID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerID="61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce" exitCode=0 Apr 16 20:49:13.283664 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:13.283338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" event={"ID":"af4bcc1c-761f-41a4-9e6f-8bebef27960c","Type":"ContainerDied","Data":"61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce"} Apr 16 20:49:14.288781 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:14.288734 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" event={"ID":"af4bcc1c-761f-41a4-9e6f-8bebef27960c","Type":"ContainerStarted","Data":"8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f"} Apr 16 20:49:14.289298 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:14.288978 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" Apr 16 20:49:14.306215 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:14.306171 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" podStartSLOduration=6.306155631 podStartE2EDuration="6.306155631s" podCreationTimestamp="2026-04-16 20:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:49:14.306078525 +0000 UTC m=+3317.840944563" watchObservedRunningTime="2026-04-16 20:49:14.306155631 +0000 UTC m=+3317.841021667" Apr 16 20:49:45.332198 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:45.332140 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:49:55.294252 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:49:55.294204 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:50:05.295202 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:05.295163 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" Apr 16 20:50:08.535131 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.535098 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"] Apr 16 20:50:08.535585 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.535351 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" 
containerID="cri-o://8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f" gracePeriod=30 Apr 16 20:50:08.606427 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.606386 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x"] Apr 16 20:50:08.606862 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.606843 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" Apr 16 20:50:08.606962 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.606865 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" Apr 16 20:50:08.606962 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.606882 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="storage-initializer" Apr 16 20:50:08.606962 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.606891 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="storage-initializer" Apr 16 20:50:08.607164 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.606979 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4818693-7888-48d7-be9d-d94ba214c0a2" containerName="kserve-container" Apr 16 20:50:08.610216 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.610195 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:50:08.617097 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.617070 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x"] Apr 16 20:50:08.634876 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.634829 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd29d77-c131-4ee3-9216-58b493267eff-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-d5h4x\" (UID: \"2fd29d77-c131-4ee3-9216-58b493267eff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:50:08.736304 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.736265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd29d77-c131-4ee3-9216-58b493267eff-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-d5h4x\" (UID: \"2fd29d77-c131-4ee3-9216-58b493267eff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:50:08.736673 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.736651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd29d77-c131-4ee3-9216-58b493267eff-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-d5h4x\" (UID: \"2fd29d77-c131-4ee3-9216-58b493267eff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:50:08.921995 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:08.921891 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:50:09.048108 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:09.048076 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x"] Apr 16 20:50:09.051397 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:50:09.051364 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd29d77_c131_4ee3_9216_58b493267eff.slice/crio-dd1df1e5cbbea420ee271f28d47ea31fcf44ad668818eb775c00931196f84843 WatchSource:0}: Error finding container dd1df1e5cbbea420ee271f28d47ea31fcf44ad668818eb775c00931196f84843: Status 404 returned error can't find the container with id dd1df1e5cbbea420ee271f28d47ea31fcf44ad668818eb775c00931196f84843 Apr 16 20:50:09.480800 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:09.480759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" event={"ID":"2fd29d77-c131-4ee3-9216-58b493267eff","Type":"ContainerStarted","Data":"20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a"} Apr 16 20:50:09.480800 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:09.480803 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" event={"ID":"2fd29d77-c131-4ee3-9216-58b493267eff","Type":"ContainerStarted","Data":"dd1df1e5cbbea420ee271f28d47ea31fcf44ad668818eb775c00931196f84843"} Apr 16 20:50:13.497715 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:13.497682 2570 generic.go:358] "Generic (PLEG): container finished" podID="2fd29d77-c131-4ee3-9216-58b493267eff" containerID="20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a" exitCode=0 Apr 16 20:50:13.498181 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:13.497761 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" event={"ID":"2fd29d77-c131-4ee3-9216-58b493267eff","Type":"ContainerDied","Data":"20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a"} Apr 16 20:50:14.502970 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:14.502933 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" event={"ID":"2fd29d77-c131-4ee3-9216-58b493267eff","Type":"ContainerStarted","Data":"5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9"} Apr 16 20:50:14.503390 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:14.503221 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:50:14.504544 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:14.504517 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:50:14.520403 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:14.520360 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podStartSLOduration=6.520348441 podStartE2EDuration="6.520348441s" podCreationTimestamp="2026-04-16 20:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:50:14.518384915 +0000 UTC m=+3378.053250952" watchObservedRunningTime="2026-04-16 20:50:14.520348441 +0000 UTC m=+3378.055214477" Apr 16 20:50:15.293075 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:15.293001 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" 
podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.72:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.72:8080: connect: connection refused" Apr 16 20:50:15.506696 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:15.506652 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:50:16.288436 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.288412 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" Apr 16 20:50:16.406781 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.406680 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af4bcc1c-761f-41a4-9e6f-8bebef27960c-kserve-provision-location\") pod \"af4bcc1c-761f-41a4-9e6f-8bebef27960c\" (UID: \"af4bcc1c-761f-41a4-9e6f-8bebef27960c\") " Apr 16 20:50:16.407086 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.407030 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4bcc1c-761f-41a4-9e6f-8bebef27960c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "af4bcc1c-761f-41a4-9e6f-8bebef27960c" (UID: "af4bcc1c-761f-41a4-9e6f-8bebef27960c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:50:16.507499 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.507459 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af4bcc1c-761f-41a4-9e6f-8bebef27960c-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:50:16.511404 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.511374 2570 generic.go:358] "Generic (PLEG): container finished" podID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerID="8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f" exitCode=0 Apr 16 20:50:16.511557 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.511447 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" Apr 16 20:50:16.511557 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.511463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" event={"ID":"af4bcc1c-761f-41a4-9e6f-8bebef27960c","Type":"ContainerDied","Data":"8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f"} Apr 16 20:50:16.511557 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.511499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw" event={"ID":"af4bcc1c-761f-41a4-9e6f-8bebef27960c","Type":"ContainerDied","Data":"7ef7538a259ff54e1d8eb2efbfc282354626e01e2655d2d55ad9e137ffcebcb3"} Apr 16 20:50:16.511557 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.511518 2570 scope.go:117] "RemoveContainer" containerID="8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f" Apr 16 20:50:16.519812 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.519792 2570 scope.go:117] "RemoveContainer" 
containerID="61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce" Apr 16 20:50:16.526884 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.526857 2570 scope.go:117] "RemoveContainer" containerID="8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f" Apr 16 20:50:16.527147 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:50:16.527126 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f\": container with ID starting with 8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f not found: ID does not exist" containerID="8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f" Apr 16 20:50:16.527216 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.527155 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f"} err="failed to get container status \"8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f\": rpc error: code = NotFound desc = could not find container \"8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f\": container with ID starting with 8855b5123dbe1456ca2cc5743be0523dd0ae8a386d558d4d273c0253dfa73a1f not found: ID does not exist" Apr 16 20:50:16.527216 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.527173 2570 scope.go:117] "RemoveContainer" containerID="61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce" Apr 16 20:50:16.527412 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:50:16.527394 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce\": container with ID starting with 61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce not found: ID does not exist" 
containerID="61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce" Apr 16 20:50:16.527468 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.527422 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce"} err="failed to get container status \"61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce\": rpc error: code = NotFound desc = could not find container \"61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce\": container with ID starting with 61b9eafe7b2cc2f811734898a96b45a9537c2fb7414938ad0c8901a1cb47c0ce not found: ID does not exist" Apr 16 20:50:16.534635 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.534610 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"] Apr 16 20:50:16.537214 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:16.537192 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-b5rsw"] Apr 16 20:50:17.111472 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:17.111440 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" path="/var/lib/kubelet/pods/af4bcc1c-761f-41a4-9e6f-8bebef27960c/volumes" Apr 16 20:50:25.507325 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:25.507277 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:50:35.507523 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:35.507478 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" 
podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:50:45.507393 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:45.507291 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:50:55.506815 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:50:55.506763 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:51:05.507180 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:05.507140 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Apr 16 20:51:15.507161 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:15.507127 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:51:18.727751 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.727710 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x"] Apr 16 20:51:18.728294 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.728027 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" 
containerName="kserve-container" containerID="cri-o://5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9" gracePeriod=30 Apr 16 20:51:18.783774 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.783736 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc"] Apr 16 20:51:18.784177 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.784162 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" Apr 16 20:51:18.784177 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.784178 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" Apr 16 20:51:18.784279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.784194 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="storage-initializer" Apr 16 20:51:18.784279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.784199 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="storage-initializer" Apr 16 20:51:18.784279 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.784257 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="af4bcc1c-761f-41a4-9e6f-8bebef27960c" containerName="kserve-container" Apr 16 20:51:18.787379 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.787363 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:51:18.789762 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.789734 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 20:51:18.798218 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.798187 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc"] Apr 16 20:51:18.845925 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.845886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d1d0183-ade3-4e91-9a48-848b915ffe06-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc\" (UID: \"3d1d0183-ade3-4e91-9a48-848b915ffe06\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:51:18.946536 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.946494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d1d0183-ade3-4e91-9a48-848b915ffe06-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc\" (UID: \"3d1d0183-ade3-4e91-9a48-848b915ffe06\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:51:18.946898 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:18.946875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d1d0183-ade3-4e91-9a48-848b915ffe06-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc\" (UID: \"3d1d0183-ade3-4e91-9a48-848b915ffe06\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:51:19.098347 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:19.098290 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:51:19.220833 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:19.220807 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc"] Apr 16 20:51:19.223096 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:51:19.223049 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1d0183_ade3_4e91_9a48_848b915ffe06.slice/crio-dbc5447dacdf156aa0250eb02fa32ec8946ce97f6506c835e7d4d38da859edaa WatchSource:0}: Error finding container dbc5447dacdf156aa0250eb02fa32ec8946ce97f6506c835e7d4d38da859edaa: Status 404 returned error can't find the container with id dbc5447dacdf156aa0250eb02fa32ec8946ce97f6506c835e7d4d38da859edaa Apr 16 20:51:19.742473 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:19.742433 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" event={"ID":"3d1d0183-ade3-4e91-9a48-848b915ffe06","Type":"ContainerStarted","Data":"51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883"} Apr 16 20:51:19.742473 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:19.742477 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" event={"ID":"3d1d0183-ade3-4e91-9a48-848b915ffe06","Type":"ContainerStarted","Data":"dbc5447dacdf156aa0250eb02fa32ec8946ce97f6506c835e7d4d38da859edaa"} Apr 16 20:51:20.747592 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:20.747505 2570 generic.go:358] "Generic (PLEG): container finished" podID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerID="51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883" exitCode=0 Apr 16 20:51:20.747956 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:20.747590 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" event={"ID":"3d1d0183-ade3-4e91-9a48-848b915ffe06","Type":"ContainerDied","Data":"51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883"} Apr 16 20:51:21.752882 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:21.752845 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" event={"ID":"3d1d0183-ade3-4e91-9a48-848b915ffe06","Type":"ContainerStarted","Data":"caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f"} Apr 16 20:51:21.753297 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:21.752993 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:51:21.754335 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:21.754303 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:51:21.769619 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:21.769571 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podStartSLOduration=3.769557255 podStartE2EDuration="3.769557255s" podCreationTimestamp="2026-04-16 20:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:51:21.768405499 +0000 UTC m=+3445.303271534" watchObservedRunningTime="2026-04-16 20:51:21.769557255 +0000 UTC m=+3445.304423268" Apr 16 20:51:22.580934 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.580909 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:51:22.680636 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.680551 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd29d77-c131-4ee3-9216-58b493267eff-kserve-provision-location\") pod \"2fd29d77-c131-4ee3-9216-58b493267eff\" (UID: \"2fd29d77-c131-4ee3-9216-58b493267eff\") " Apr 16 20:51:22.680922 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.680897 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd29d77-c131-4ee3-9216-58b493267eff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2fd29d77-c131-4ee3-9216-58b493267eff" (UID: "2fd29d77-c131-4ee3-9216-58b493267eff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:51:22.760294 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.760257 2570 generic.go:358] "Generic (PLEG): container finished" podID="2fd29d77-c131-4ee3-9216-58b493267eff" containerID="5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9" exitCode=0 Apr 16 20:51:22.760700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.760326 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" Apr 16 20:51:22.760700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.760346 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" event={"ID":"2fd29d77-c131-4ee3-9216-58b493267eff","Type":"ContainerDied","Data":"5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9"} Apr 16 20:51:22.760700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.760388 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x" event={"ID":"2fd29d77-c131-4ee3-9216-58b493267eff","Type":"ContainerDied","Data":"dd1df1e5cbbea420ee271f28d47ea31fcf44ad668818eb775c00931196f84843"} Apr 16 20:51:22.760700 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.760403 2570 scope.go:117] "RemoveContainer" containerID="5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9" Apr 16 20:51:22.760934 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.760783 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:51:22.768717 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.768687 2570 scope.go:117] "RemoveContainer" containerID="20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a" Apr 16 20:51:22.776092 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.776073 2570 scope.go:117] "RemoveContainer" containerID="5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9" Apr 16 20:51:22.776336 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:51:22.776315 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9\": container with ID starting with 5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9 not found: ID does not exist" containerID="5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9" Apr 16 20:51:22.776394 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.776343 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9"} err="failed to get container status \"5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9\": rpc error: code = NotFound desc = could not find container \"5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9\": container with ID starting with 5d33597b2b0537d0b1289616786e9495bceee815dbefe0c1798b281629027fe9 not found: ID does not exist" Apr 16 20:51:22.776394 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.776361 2570 scope.go:117] "RemoveContainer" containerID="20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a" Apr 16 20:51:22.776574 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:51:22.776552 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a\": container with ID starting with 20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a not found: ID does not exist" containerID="20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a" Apr 16 20:51:22.776628 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.776585 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a"} err="failed to get container status \"20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a\": rpc error: code = NotFound desc = could not find container 
\"20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a\": container with ID starting with 20cf07f49e28ee2b7474c929a9838421c1df72b54f0325e5e875450eb779660a not found: ID does not exist" Apr 16 20:51:22.781539 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.781518 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x"] Apr 16 20:51:22.781824 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.781806 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd29d77-c131-4ee3-9216-58b493267eff-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:51:22.785873 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:22.785854 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d5h4x"] Apr 16 20:51:23.111307 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:23.111269 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" path="/var/lib/kubelet/pods/2fd29d77-c131-4ee3-9216-58b493267eff/volumes" Apr 16 20:51:32.761622 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:32.761576 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:51:42.761368 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:51:42.761314 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:51:52.761385 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:51:52.761335 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:52:02.761303 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:02.761253 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:52:12.760904 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:12.760854 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:52:22.760811 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:22.760708 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:52:32.762261 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:32.762219 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:52:38.921646 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:38.921603 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc"] Apr 16 20:52:38.922039 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:38.921913 2570 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" containerID="cri-o://caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f" gracePeriod=30 Apr 16 20:52:39.055297 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.055260 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"] Apr 16 20:52:39.055636 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.055623 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="storage-initializer" Apr 16 20:52:39.055682 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.055638 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="storage-initializer" Apr 16 20:52:39.055682 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.055655 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" Apr 16 20:52:39.055682 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.055661 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" Apr 16 20:52:39.055781 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.055724 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fd29d77-c131-4ee3-9216-58b493267eff" containerName="kserve-container" Apr 16 20:52:39.058784 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.058761 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.061624 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.061600 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 20:52:39.068961 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.068934 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"] Apr 16 20:52:39.161364 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.161327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3bcfa34-7746-401c-81a0-32c076be46e1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.161538 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.161383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3bcfa34-7746-401c-81a0-32c076be46e1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.262460 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.262422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3bcfa34-7746-401c-81a0-32c076be46e1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.262630 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.262492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3bcfa34-7746-401c-81a0-32c076be46e1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.262927 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.262905 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3bcfa34-7746-401c-81a0-32c076be46e1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.263159 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.263141 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3bcfa34-7746-401c-81a0-32c076be46e1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.370485 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.370442 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:39.492446 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.492414 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"] Apr 16 20:52:39.494756 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:52:39.494728 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3bcfa34_7746_401c_81a0_32c076be46e1.slice/crio-09f72ac30ed4cd8233d10615bd8eb4014051173e83c93277b4a725ccc1bcfba9 WatchSource:0}: Error finding container 09f72ac30ed4cd8233d10615bd8eb4014051173e83c93277b4a725ccc1bcfba9: Status 404 returned error can't find the container with id 09f72ac30ed4cd8233d10615bd8eb4014051173e83c93277b4a725ccc1bcfba9 Apr 16 20:52:39.496993 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:39.496971 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:52:40.029466 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:40.029432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" event={"ID":"a3bcfa34-7746-401c-81a0-32c076be46e1","Type":"ContainerStarted","Data":"75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595"} Apr 16 20:52:40.029466 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:40.029466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" event={"ID":"a3bcfa34-7746-401c-81a0-32c076be46e1","Type":"ContainerStarted","Data":"09f72ac30ed4cd8233d10615bd8eb4014051173e83c93277b4a725ccc1bcfba9"} Apr 16 20:52:41.035259 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:41.035219 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3bcfa34-7746-401c-81a0-32c076be46e1" 
containerID="75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595" exitCode=0 Apr 16 20:52:41.035659 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:41.035268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" event={"ID":"a3bcfa34-7746-401c-81a0-32c076be46e1","Type":"ContainerDied","Data":"75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595"} Apr 16 20:52:42.041024 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:42.040982 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" event={"ID":"a3bcfa34-7746-401c-81a0-32c076be46e1","Type":"ContainerStarted","Data":"e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910"} Apr 16 20:52:42.041463 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:42.041224 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:52:42.042232 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:42.042207 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:52:42.063329 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:42.063282 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podStartSLOduration=3.063266512 podStartE2EDuration="3.063266512s" podCreationTimestamp="2026-04-16 20:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:52:42.061519025 +0000 UTC 
m=+3525.596385061" watchObservedRunningTime="2026-04-16 20:52:42.063266512 +0000 UTC m=+3525.598132548" Apr 16 20:52:42.761121 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:42.761073 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.74:8080: connect: connection refused" Apr 16 20:52:43.045589 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:43.045505 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:52:43.462511 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:43.462485 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:52:43.501735 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:43.501692 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d1d0183-ade3-4e91-9a48-848b915ffe06-kserve-provision-location\") pod \"3d1d0183-ade3-4e91-9a48-848b915ffe06\" (UID: \"3d1d0183-ade3-4e91-9a48-848b915ffe06\") " Apr 16 20:52:43.502033 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:43.502010 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1d0183-ade3-4e91-9a48-848b915ffe06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d1d0183-ade3-4e91-9a48-848b915ffe06" (UID: "3d1d0183-ade3-4e91-9a48-848b915ffe06"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:52:43.603390 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:43.603289 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d1d0183-ade3-4e91-9a48-848b915ffe06-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:52:44.050167 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.050132 2570 generic.go:358] "Generic (PLEG): container finished" podID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerID="caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f" exitCode=0 Apr 16 20:52:44.050594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.050193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" event={"ID":"3d1d0183-ade3-4e91-9a48-848b915ffe06","Type":"ContainerDied","Data":"caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f"} Apr 16 20:52:44.050594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.050197 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" Apr 16 20:52:44.050594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.050220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc" event={"ID":"3d1d0183-ade3-4e91-9a48-848b915ffe06","Type":"ContainerDied","Data":"dbc5447dacdf156aa0250eb02fa32ec8946ce97f6506c835e7d4d38da859edaa"} Apr 16 20:52:44.050594 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.050235 2570 scope.go:117] "RemoveContainer" containerID="caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f" Apr 16 20:52:44.059231 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.059208 2570 scope.go:117] "RemoveContainer" containerID="51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883" Apr 16 20:52:44.066807 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.066786 2570 scope.go:117] "RemoveContainer" containerID="caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f" Apr 16 20:52:44.067094 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:52:44.067071 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f\": container with ID starting with caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f not found: ID does not exist" containerID="caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f" Apr 16 20:52:44.067168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.067104 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f"} err="failed to get container status \"caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f\": rpc error: code = NotFound desc = could not find container 
\"caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f\": container with ID starting with caa0f5c8b4ac74a7d30e2d901bd0ea50ce69558f59f8153f2e65ddbbb62c874f not found: ID does not exist" Apr 16 20:52:44.067168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.067124 2570 scope.go:117] "RemoveContainer" containerID="51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883" Apr 16 20:52:44.067364 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:52:44.067337 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883\": container with ID starting with 51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883 not found: ID does not exist" containerID="51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883" Apr 16 20:52:44.067404 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.067375 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883"} err="failed to get container status \"51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883\": rpc error: code = NotFound desc = could not find container \"51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883\": container with ID starting with 51cef4fc29150551234818fb0daf0cec108fa4c7c1f24474fb7d88d98fdd4883 not found: ID does not exist" Apr 16 20:52:44.075637 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.075615 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc"] Apr 16 20:52:44.088174 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:44.088145 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-9wmwc"] Apr 16 20:52:45.112333 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:45.112296 2570 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" path="/var/lib/kubelet/pods/3d1d0183-ade3-4e91-9a48-848b915ffe06/volumes" Apr 16 20:52:53.045836 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:52:53.045790 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:53:03.046406 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:03.046355 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:53:13.046422 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:13.046375 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:53:23.045764 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:23.045714 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:53:33.046347 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:33.046304 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:53:43.046471 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:43.046427 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Apr 16 20:53:53.046259 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:53.046228 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" Apr 16 20:53:59.140568 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:59.140531 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"] Apr 16 20:53:59.141043 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:53:59.140867 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" containerID="cri-o://e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910" gracePeriod=30 Apr 16 20:54:00.203101 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.203064 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"] Apr 16 20:54:00.203583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.203499 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container" Apr 16 20:54:00.203583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.203518 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container"
Apr 16 20:54:00.203583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.203539 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="storage-initializer"
Apr 16 20:54:00.203583 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.203545 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="storage-initializer"
Apr 16 20:54:00.203788 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.203612 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d1d0183-ade3-4e91-9a48-848b915ffe06" containerName="kserve-container"
Apr 16 20:54:00.206691 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.206674 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:00.219273 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.219242 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"]
Apr 16 20:54:00.374271 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.374231 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bac58b0-9136-4f45-8e42-e0726210310c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578\" (UID: \"6bac58b0-9136-4f45-8e42-e0726210310c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:00.475335 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.475247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bac58b0-9136-4f45-8e42-e0726210310c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578\" (UID: \"6bac58b0-9136-4f45-8e42-e0726210310c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:00.475656 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.475632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bac58b0-9136-4f45-8e42-e0726210310c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578\" (UID: \"6bac58b0-9136-4f45-8e42-e0726210310c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:00.516326 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.516275 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:00.642482 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:00.642454 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"]
Apr 16 20:54:00.645293 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:54:00.645260 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bac58b0_9136_4f45_8e42_e0726210310c.slice/crio-bbb456504fee25a6be0540d7b9c2d7137e39a566fdec5588614b35b95d830292 WatchSource:0}: Error finding container bbb456504fee25a6be0540d7b9c2d7137e39a566fdec5588614b35b95d830292: Status 404 returned error can't find the container with id bbb456504fee25a6be0540d7b9c2d7137e39a566fdec5588614b35b95d830292
Apr 16 20:54:01.315477 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:01.315430 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" event={"ID":"6bac58b0-9136-4f45-8e42-e0726210310c","Type":"ContainerStarted","Data":"e2ccaa69fbfab6c850d41bd8a1e27f555e9d6e7128465ad31c2749140fc71b65"}
Apr 16 20:54:01.315904 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:01.315485 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" event={"ID":"6bac58b0-9136-4f45-8e42-e0726210310c","Type":"ContainerStarted","Data":"bbb456504fee25a6be0540d7b9c2d7137e39a566fdec5588614b35b95d830292"}
Apr 16 20:54:03.045985 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.045933 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused"
Apr 16 20:54:03.583023 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.582994 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"
Apr 16 20:54:03.704486 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.704390 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3bcfa34-7746-401c-81a0-32c076be46e1-kserve-provision-location\") pod \"a3bcfa34-7746-401c-81a0-32c076be46e1\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") "
Apr 16 20:54:03.704486 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.704433 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3bcfa34-7746-401c-81a0-32c076be46e1-cabundle-cert\") pod \"a3bcfa34-7746-401c-81a0-32c076be46e1\" (UID: \"a3bcfa34-7746-401c-81a0-32c076be46e1\") "
Apr 16 20:54:03.704706 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.704665 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3bcfa34-7746-401c-81a0-32c076be46e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3bcfa34-7746-401c-81a0-32c076be46e1" (UID: "a3bcfa34-7746-401c-81a0-32c076be46e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:54:03.704887 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.704860 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3bcfa34-7746-401c-81a0-32c076be46e1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a3bcfa34-7746-401c-81a0-32c076be46e1" (UID: "a3bcfa34-7746-401c-81a0-32c076be46e1"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:54:03.805728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.805688 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3bcfa34-7746-401c-81a0-32c076be46e1-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:54:03.805728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:03.805724 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3bcfa34-7746-401c-81a0-32c076be46e1-cabundle-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:54:04.328091 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.328042 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578_6bac58b0-9136-4f45-8e42-e0726210310c/storage-initializer/0.log"
Apr 16 20:54:04.328655 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.328108 2570 generic.go:358] "Generic (PLEG): container finished" podID="6bac58b0-9136-4f45-8e42-e0726210310c" containerID="e2ccaa69fbfab6c850d41bd8a1e27f555e9d6e7128465ad31c2749140fc71b65" exitCode=1
Apr 16 20:54:04.328655 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.328192 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" event={"ID":"6bac58b0-9136-4f45-8e42-e0726210310c","Type":"ContainerDied","Data":"e2ccaa69fbfab6c850d41bd8a1e27f555e9d6e7128465ad31c2749140fc71b65"}
Apr 16 20:54:04.329849 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.329809 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerID="e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910" exitCode=0
Apr 16 20:54:04.329942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.329893 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"
Apr 16 20:54:04.329942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.329894 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" event={"ID":"a3bcfa34-7746-401c-81a0-32c076be46e1","Type":"ContainerDied","Data":"e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910"}
Apr 16 20:54:04.329942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.329927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm" event={"ID":"a3bcfa34-7746-401c-81a0-32c076be46e1","Type":"ContainerDied","Data":"09f72ac30ed4cd8233d10615bd8eb4014051173e83c93277b4a725ccc1bcfba9"}
Apr 16 20:54:04.329942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.329942 2570 scope.go:117] "RemoveContainer" containerID="e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910"
Apr 16 20:54:04.341846 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.341824 2570 scope.go:117] "RemoveContainer" containerID="75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595"
Apr 16 20:54:04.357852 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.357830 2570 scope.go:117] "RemoveContainer" containerID="e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910"
Apr 16 20:54:04.358232 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:54:04.358204 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910\": container with ID starting with e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910 not found: ID does not exist" containerID="e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910"
Apr 16 20:54:04.358316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.358245 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910"} err="failed to get container status \"e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910\": rpc error: code = NotFound desc = could not find container \"e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910\": container with ID starting with e5dcc49dde4ffe7c6eef33926e5735cb077f14fbda8b8fe925755e29889f4910 not found: ID does not exist"
Apr 16 20:54:04.358316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.358272 2570 scope.go:117] "RemoveContainer" containerID="75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595"
Apr 16 20:54:04.358567 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:54:04.358549 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595\": container with ID starting with 75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595 not found: ID does not exist" containerID="75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595"
Apr 16 20:54:04.358624 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.358573 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595"} err="failed to get container status \"75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595\": rpc error: code = NotFound desc = could not find container \"75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595\": container with ID starting with 75a783d1d6f86cdcbf99bd1b85040c992bd96b64fd277a6ea11c721ece6b1595 not found: ID does not exist"
Apr 16 20:54:04.361480 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.361456 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"]
Apr 16 20:54:04.367277 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:04.367255 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-6hfgm"]
Apr 16 20:54:05.115370 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:05.115327 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" path="/var/lib/kubelet/pods/a3bcfa34-7746-401c-81a0-32c076be46e1/volumes"
Apr 16 20:54:05.335183 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:05.335157 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578_6bac58b0-9136-4f45-8e42-e0726210310c/storage-initializer/0.log"
Apr 16 20:54:05.335613 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:05.335261 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" event={"ID":"6bac58b0-9136-4f45-8e42-e0726210310c","Type":"ContainerStarted","Data":"475b9c11929271e0bf57fbdc75c9a11dab23679c87373823c6cc720ed2895878"}
Apr 16 20:54:10.256838 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.256791 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"]
Apr 16 20:54:10.257346 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.257173 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer" containerID="cri-o://475b9c11929271e0bf57fbdc75c9a11dab23679c87373823c6cc720ed2895878" gracePeriod=30
Apr 16 20:54:10.354164 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.354138 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578_6bac58b0-9136-4f45-8e42-e0726210310c/storage-initializer/1.log"
Apr 16 20:54:10.354543 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.354522 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578_6bac58b0-9136-4f45-8e42-e0726210310c/storage-initializer/0.log"
Apr 16 20:54:10.354656 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.354566 2570 generic.go:358] "Generic (PLEG): container finished" podID="6bac58b0-9136-4f45-8e42-e0726210310c" containerID="475b9c11929271e0bf57fbdc75c9a11dab23679c87373823c6cc720ed2895878" exitCode=1
Apr 16 20:54:10.354656 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.354637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" event={"ID":"6bac58b0-9136-4f45-8e42-e0726210310c","Type":"ContainerDied","Data":"475b9c11929271e0bf57fbdc75c9a11dab23679c87373823c6cc720ed2895878"}
Apr 16 20:54:10.354763 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.354682 2570 scope.go:117] "RemoveContainer" containerID="e2ccaa69fbfab6c850d41bd8a1e27f555e9d6e7128465ad31c2749140fc71b65"
Apr 16 20:54:10.396166 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.396145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578_6bac58b0-9136-4f45-8e42-e0726210310c/storage-initializer/1.log"
Apr 16 20:54:10.396281 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.396212 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:10.563824 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.563727 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bac58b0-9136-4f45-8e42-e0726210310c-kserve-provision-location\") pod \"6bac58b0-9136-4f45-8e42-e0726210310c\" (UID: \"6bac58b0-9136-4f45-8e42-e0726210310c\") "
Apr 16 20:54:10.564032 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.564008 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bac58b0-9136-4f45-8e42-e0726210310c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6bac58b0-9136-4f45-8e42-e0726210310c" (UID: "6bac58b0-9136-4f45-8e42-e0726210310c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:54:10.665241 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:10.665204 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bac58b0-9136-4f45-8e42-e0726210310c-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:54:11.316460 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316424 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"]
Apr 16 20:54:11.316854 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316830 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="storage-initializer"
Apr 16 20:54:11.316854 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316846 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="storage-initializer"
Apr 16 20:54:11.316931 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316870 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container"
Apr 16 20:54:11.316931 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316876 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container"
Apr 16 20:54:11.316931 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316886 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer"
Apr 16 20:54:11.316931 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316893 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer"
Apr 16 20:54:11.317087 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316946 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3bcfa34-7746-401c-81a0-32c076be46e1" containerName="kserve-container"
Apr 16 20:54:11.317087 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316957 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer"
Apr 16 20:54:11.317087 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.316965 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer"
Apr 16 20:54:11.317087 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.317035 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer"
Apr 16 20:54:11.317087 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.317041 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" containerName="storage-initializer"
Apr 16 20:54:11.321475 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.321454 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.325977 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.325956 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 20:54:11.336771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.336745 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"]
Apr 16 20:54:11.359431 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.359404 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578_6bac58b0-9136-4f45-8e42-e0726210310c/storage-initializer/1.log"
Apr 16 20:54:11.359599 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.359500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578" event={"ID":"6bac58b0-9136-4f45-8e42-e0726210310c","Type":"ContainerDied","Data":"bbb456504fee25a6be0540d7b9c2d7137e39a566fdec5588614b35b95d830292"}
Apr 16 20:54:11.359599 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.359524 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"
Apr 16 20:54:11.359599 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.359538 2570 scope.go:117] "RemoveContainer" containerID="475b9c11929271e0bf57fbdc75c9a11dab23679c87373823c6cc720ed2895878"
Apr 16 20:54:11.370135 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.370072 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.370135 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.370127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.399746 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.399712 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"]
Apr 16 20:54:11.401403 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.401377 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-gp578"]
Apr 16 20:54:11.470813 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.470781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.471004 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.470887 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.471265 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.471243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.471442 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.471420 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.631577 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.631480 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:11.753990 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:11.753957 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"]
Apr 16 20:54:11.757598 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:54:11.757564 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a71a811_11db_4d5f_9500_5f8a5a7fca5e.slice/crio-e29623c971a2a6175f1bc7d75339c001b8fc16c4ba7a5d2a5795b70319fe0cc0 WatchSource:0}: Error finding container e29623c971a2a6175f1bc7d75339c001b8fc16c4ba7a5d2a5795b70319fe0cc0: Status 404 returned error can't find the container with id e29623c971a2a6175f1bc7d75339c001b8fc16c4ba7a5d2a5795b70319fe0cc0
Apr 16 20:54:12.364903 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:12.364859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" event={"ID":"6a71a811-11db-4d5f-9500-5f8a5a7fca5e","Type":"ContainerStarted","Data":"4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e"}
Apr 16 20:54:12.364903 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:12.364907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" event={"ID":"6a71a811-11db-4d5f-9500-5f8a5a7fca5e","Type":"ContainerStarted","Data":"e29623c971a2a6175f1bc7d75339c001b8fc16c4ba7a5d2a5795b70319fe0cc0"}
Apr 16 20:54:13.111761 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:13.111677 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bac58b0-9136-4f45-8e42-e0726210310c" path="/var/lib/kubelet/pods/6bac58b0-9136-4f45-8e42-e0726210310c/volumes"
Apr 16 20:54:13.371030 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:13.370937 2570 generic.go:358] "Generic (PLEG): container finished" podID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerID="4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e" exitCode=0
Apr 16 20:54:13.371444 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:13.371030 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" event={"ID":"6a71a811-11db-4d5f-9500-5f8a5a7fca5e","Type":"ContainerDied","Data":"4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e"}
Apr 16 20:54:14.375654 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:14.375620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" event={"ID":"6a71a811-11db-4d5f-9500-5f8a5a7fca5e","Type":"ContainerStarted","Data":"7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6"}
Apr 16 20:54:14.376083 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:14.375845 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:54:14.377198 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:14.377170 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:54:14.394100 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:14.394017 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podStartSLOduration=3.394002062 podStartE2EDuration="3.394002062s" podCreationTimestamp="2026-04-16 20:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:54:14.39275415 +0000 UTC m=+3617.927620199" watchObservedRunningTime="2026-04-16 20:54:14.394002062 +0000 UTC m=+3617.928868169"
Apr 16 20:54:15.379104 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:15.379040 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:54:25.379863 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:25.379809 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:54:35.379448 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:35.379395 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:54:45.379761 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:45.379642 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:54:55.379214 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:54:55.379168 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:55:05.379508 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:05.379462 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:55:15.379590 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:15.379487 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:55:25.380256 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:25.380222 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:55:31.321385 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:31.321351 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"]
Apr 16 20:55:31.321913 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:31.321605 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" containerID="cri-o://7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6" gracePeriod=30
Apr 16 20:55:32.392602 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.392564 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"]
Apr 16 20:55:32.396341 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.396320 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"
Apr 16 20:55:32.404384 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.404352 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"]
Apr 16 20:55:32.459459 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.459408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab4395d-cf1e-4be3-90a1-112c78ed2fe6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx\" (UID: \"eab4395d-cf1e-4be3-90a1-112c78ed2fe6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"
Apr 16 20:55:32.560716 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.560665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab4395d-cf1e-4be3-90a1-112c78ed2fe6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx\" (UID: \"eab4395d-cf1e-4be3-90a1-112c78ed2fe6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"
Apr 16 20:55:32.561048 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.561024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab4395d-cf1e-4be3-90a1-112c78ed2fe6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx\" (UID: \"eab4395d-cf1e-4be3-90a1-112c78ed2fe6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"
Apr 16 20:55:32.707948 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.707848 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"
Apr 16 20:55:32.841492 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:32.841428 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"]
Apr 16 20:55:32.844491 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:55:32.844464 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab4395d_cf1e_4be3_90a1_112c78ed2fe6.slice/crio-6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757 WatchSource:0}: Error finding container 6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757: Status 404 returned error can't find the container with id 6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757
Apr 16 20:55:33.656042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:33.656002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" event={"ID":"eab4395d-cf1e-4be3-90a1-112c78ed2fe6","Type":"ContainerStarted","Data":"3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132"}
Apr 16 20:55:33.656494 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:33.656045 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" event={"ID":"eab4395d-cf1e-4be3-90a1-112c78ed2fe6","Type":"ContainerStarted","Data":"6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757"}
Apr 16 20:55:35.380026 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:35.379979 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused"
Apr 16 20:55:35.859536 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:35.859510 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"
Apr 16 20:55:35.993706 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:35.993618 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-cabundle-cert\") pod \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") "
Apr 16 20:55:35.993706 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:35.993688 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-kserve-provision-location\") pod \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\" (UID: \"6a71a811-11db-4d5f-9500-5f8a5a7fca5e\") "
Apr 16 20:55:35.993995 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:35.993973 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6a71a811-11db-4d5f-9500-5f8a5a7fca5e" (UID: "6a71a811-11db-4d5f-9500-5f8a5a7fca5e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:55:35.994041 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:35.993993 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a71a811-11db-4d5f-9500-5f8a5a7fca5e" (UID: "6a71a811-11db-4d5f-9500-5f8a5a7fca5e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:55:36.095203 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.095169 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-cabundle-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:55:36.095203 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.095199 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a71a811-11db-4d5f-9500-5f8a5a7fca5e-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\""
Apr 16 20:55:36.668874 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.668830 2570 generic.go:358] "Generic (PLEG): container finished" podID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerID="7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6" exitCode=0
Apr 16 20:55:36.669371 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.668882 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" event={"ID":"6a71a811-11db-4d5f-9500-5f8a5a7fca5e","Type":"ContainerDied","Data":"7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6"}
Apr 16 20:55:36.669371 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.668918 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" event={"ID":"6a71a811-11db-4d5f-9500-5f8a5a7fca5e","Type":"ContainerDied","Data":"e29623c971a2a6175f1bc7d75339c001b8fc16c4ba7a5d2a5795b70319fe0cc0"} Apr 16 20:55:36.669371 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.668929 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll" Apr 16 20:55:36.669371 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.668938 2570 scope.go:117] "RemoveContainer" containerID="7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6" Apr 16 20:55:36.677873 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.677848 2570 scope.go:117] "RemoveContainer" containerID="4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e" Apr 16 20:55:36.685669 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.685649 2570 scope.go:117] "RemoveContainer" containerID="7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6" Apr 16 20:55:36.685899 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:55:36.685884 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6\": container with ID starting with 7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6 not found: ID does not exist" containerID="7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6" Apr 16 20:55:36.685950 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.685906 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6"} err="failed to get container status \"7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6\": rpc error: code = NotFound desc = could not find container 
\"7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6\": container with ID starting with 7a46a0bb94c3098f4d532d6ae9bce51e0099c816599e1ee5e622f99095067dc6 not found: ID does not exist" Apr 16 20:55:36.685950 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.685922 2570 scope.go:117] "RemoveContainer" containerID="4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e" Apr 16 20:55:36.686621 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:55:36.686601 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e\": container with ID starting with 4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e not found: ID does not exist" containerID="4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e" Apr 16 20:55:36.686676 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.686630 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e"} err="failed to get container status \"4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e\": rpc error: code = NotFound desc = could not find container \"4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e\": container with ID starting with 4110fd04e527d325ab43fb08681a98c88ea31ce807e28e6ca71a9c32e5d3b19e not found: ID does not exist" Apr 16 20:55:36.704826 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.704794 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"] Apr 16 20:55:36.717855 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:36.717825 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-hkjll"] Apr 16 20:55:37.113656 ip-10-0-129-34 kubenswrapper[2570]: I0416 
20:55:37.113621 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" path="/var/lib/kubelet/pods/6a71a811-11db-4d5f-9500-5f8a5a7fca5e/volumes" Apr 16 20:55:39.682548 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:39.682518 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/0.log" Apr 16 20:55:39.682945 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:39.682558 2570 generic.go:358] "Generic (PLEG): container finished" podID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerID="3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132" exitCode=1 Apr 16 20:55:39.682945 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:39.682632 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" event={"ID":"eab4395d-cf1e-4be3-90a1-112c78ed2fe6","Type":"ContainerDied","Data":"3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132"} Apr 16 20:55:40.687228 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:40.687195 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/0.log" Apr 16 20:55:40.687735 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:40.687296 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" event={"ID":"eab4395d-cf1e-4be3-90a1-112c78ed2fe6","Type":"ContainerStarted","Data":"226403ccce45e82c9f55fcc856377cbb32a972b6c476b7a9586ead7f7bb14545"} Apr 16 20:55:41.693219 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:41.693133 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/1.log" Apr 16 20:55:41.693631 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:41.693506 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/0.log" Apr 16 20:55:41.693631 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:41.693538 2570 generic.go:358] "Generic (PLEG): container finished" podID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerID="226403ccce45e82c9f55fcc856377cbb32a972b6c476b7a9586ead7f7bb14545" exitCode=1 Apr 16 20:55:41.693631 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:41.693581 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" event={"ID":"eab4395d-cf1e-4be3-90a1-112c78ed2fe6","Type":"ContainerDied","Data":"226403ccce45e82c9f55fcc856377cbb32a972b6c476b7a9586ead7f7bb14545"} Apr 16 20:55:41.693631 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:41.693623 2570 scope.go:117] "RemoveContainer" containerID="3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132" Apr 16 20:55:41.693974 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:41.693959 2570 scope.go:117] "RemoveContainer" containerID="3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132" Apr 16 20:55:41.704321 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:55:41.704283 2570 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_kserve-ci-e2e-test_eab4395d-cf1e-4be3-90a1-112c78ed2fe6_0 in pod sandbox 6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757 from index: no such id: 
'3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132'" containerID="3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132" Apr 16 20:55:41.704408 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:55:41.704347 2570 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_kserve-ci-e2e-test_eab4395d-cf1e-4be3-90a1-112c78ed2fe6_0 in pod sandbox 6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757 from index: no such id: '3a456c5db1d8db915d5dbae9e7af56f8a25dc6c3c872fbcfeb18db6636df5132'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_kserve-ci-e2e-test(eab4395d-cf1e-4be3-90a1-112c78ed2fe6)\"" logger="UnhandledError" Apr 16 20:55:41.705692 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:55:41.705670 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_kserve-ci-e2e-test(eab4395d-cf1e-4be3-90a1-112c78ed2fe6)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" Apr 16 20:55:42.394701 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:42.394667 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"] Apr 16 20:55:42.699350 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:42.699269 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/1.log" Apr 16 20:55:42.831476 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:42.831455 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/1.log" Apr 16 20:55:42.831607 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:42.831515 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" Apr 16 20:55:42.956687 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:42.956583 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab4395d-cf1e-4be3-90a1-112c78ed2fe6-kserve-provision-location\") pod \"eab4395d-cf1e-4be3-90a1-112c78ed2fe6\" (UID: \"eab4395d-cf1e-4be3-90a1-112c78ed2fe6\") " Apr 16 20:55:42.956902 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:42.956880 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab4395d-cf1e-4be3-90a1-112c78ed2fe6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eab4395d-cf1e-4be3-90a1-112c78ed2fe6" (UID: "eab4395d-cf1e-4be3-90a1-112c78ed2fe6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:55:43.058030 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.057977 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab4395d-cf1e-4be3-90a1-112c78ed2fe6-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:55:43.464044 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.463998 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g"] Apr 16 20:55:43.464417 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464403 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerName="storage-initializer" Apr 16 20:55:43.464417 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464418 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerName="storage-initializer" Apr 16 20:55:43.464501 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464428 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" Apr 16 20:55:43.464501 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464434 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" Apr 16 20:55:43.464501 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464443 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerName="storage-initializer" Apr 16 20:55:43.464501 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464448 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerName="storage-initializer" Apr 16 20:55:43.464501 
ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464459 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="storage-initializer" Apr 16 20:55:43.464501 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464464 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="storage-initializer" Apr 16 20:55:43.464720 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464541 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerName="storage-initializer" Apr 16 20:55:43.464720 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464549 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a71a811-11db-4d5f-9500-5f8a5a7fca5e" containerName="kserve-container" Apr 16 20:55:43.464720 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.464559 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" containerName="storage-initializer" Apr 16 20:55:43.468863 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.468843 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.471308 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.471288 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 20:55:43.478358 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.478332 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g"] Apr 16 20:55:43.561810 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.561760 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/576d1d0a-215c-47d6-acf2-23e5fbc614e0-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.562042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.561883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/576d1d0a-215c-47d6-acf2-23e5fbc614e0-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.663176 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.663139 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/576d1d0a-215c-47d6-acf2-23e5fbc614e0-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.663317 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.663205 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/576d1d0a-215c-47d6-acf2-23e5fbc614e0-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.663523 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.663501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/576d1d0a-215c-47d6-acf2-23e5fbc614e0-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.663760 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.663741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/576d1d0a-215c-47d6-acf2-23e5fbc614e0-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.703771 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.703742 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx_eab4395d-cf1e-4be3-90a1-112c78ed2fe6/storage-initializer/1.log" Apr 16 20:55:43.704202 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.703837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" event={"ID":"eab4395d-cf1e-4be3-90a1-112c78ed2fe6","Type":"ContainerDied","Data":"6792e6424c3fee0cb7825dde28269d33e9c147bc77984fee0e4e1d2259194757"} Apr 16 20:55:43.704202 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.703883 2570 scope.go:117] "RemoveContainer" containerID="226403ccce45e82c9f55fcc856377cbb32a972b6c476b7a9586ead7f7bb14545" Apr 16 20:55:43.704202 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.703885 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx" Apr 16 20:55:43.736645 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.736566 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"] Apr 16 20:55:43.741305 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.741274 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-flkvx"] Apr 16 20:55:43.780145 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.780015 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:43.904532 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:43.904507 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g"] Apr 16 20:55:43.906370 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:55:43.906341 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576d1d0a_215c_47d6_acf2_23e5fbc614e0.slice/crio-9771f02a2cbe37aa8cfa1876d6faeb353802d4fa2cfcd01ea57599ab33636e13 WatchSource:0}: Error finding container 9771f02a2cbe37aa8cfa1876d6faeb353802d4fa2cfcd01ea57599ab33636e13: Status 404 returned error can't find the container with id 9771f02a2cbe37aa8cfa1876d6faeb353802d4fa2cfcd01ea57599ab33636e13 Apr 16 20:55:44.710035 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:44.709995 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" event={"ID":"576d1d0a-215c-47d6-acf2-23e5fbc614e0","Type":"ContainerStarted","Data":"ee159aa4931c6299cf1ae38ccf7f333e4f5e9bc01e1c7e8c041318d32d997a84"} Apr 16 20:55:44.710035 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:44.710035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" event={"ID":"576d1d0a-215c-47d6-acf2-23e5fbc614e0","Type":"ContainerStarted","Data":"9771f02a2cbe37aa8cfa1876d6faeb353802d4fa2cfcd01ea57599ab33636e13"} Apr 16 20:55:45.111645 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:45.111608 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab4395d-cf1e-4be3-90a1-112c78ed2fe6" path="/var/lib/kubelet/pods/eab4395d-cf1e-4be3-90a1-112c78ed2fe6/volumes" Apr 16 20:55:45.719275 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:45.719235 2570 
generic.go:358] "Generic (PLEG): container finished" podID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerID="ee159aa4931c6299cf1ae38ccf7f333e4f5e9bc01e1c7e8c041318d32d997a84" exitCode=0 Apr 16 20:55:45.719746 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:45.719321 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" event={"ID":"576d1d0a-215c-47d6-acf2-23e5fbc614e0","Type":"ContainerDied","Data":"ee159aa4931c6299cf1ae38ccf7f333e4f5e9bc01e1c7e8c041318d32d997a84"} Apr 16 20:55:46.724352 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:46.724314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" event={"ID":"576d1d0a-215c-47d6-acf2-23e5fbc614e0","Type":"ContainerStarted","Data":"b4e6531a7203745b7807d7da65edf0ed60d3e13b269c5a8d1c254dbd087a11db"} Apr 16 20:55:46.724797 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:46.724515 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:55:46.725875 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:46.725842 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:55:46.741651 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:46.741604 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podStartSLOduration=3.741590017 podStartE2EDuration="3.741590017s" podCreationTimestamp="2026-04-16 20:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:55:46.740433331 +0000 UTC m=+3710.275299365" watchObservedRunningTime="2026-04-16 20:55:46.741590017 +0000 UTC m=+3710.276456052" Apr 16 20:55:47.727683 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:47.727646 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:55:57.728234 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:55:57.728185 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:56:07.728571 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:56:07.728525 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:56:17.727707 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:56:17.727657 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:56:27.728676 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:56:27.728630 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" 
podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:56:37.728566 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:56:37.728511 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:56:47.728458 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:56:47.728360 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:56:57.729090 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:56:57.729037 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:57:03.497013 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:03.496977 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g"] Apr 16 20:57:03.497512 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:03.497219 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" containerID="cri-o://b4e6531a7203745b7807d7da65edf0ed60d3e13b269c5a8d1c254dbd087a11db" gracePeriod=30 Apr 16 20:57:04.549721 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.549678 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc"] Apr 16 20:57:04.553403 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.553376 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:04.561536 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.561514 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc"] Apr 16 20:57:04.677462 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.677414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660ff524-9745-4b61-b34e-14daac5e0fb3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc\" (UID: \"660ff524-9745-4b61-b34e-14daac5e0fb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:04.778418 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.778350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660ff524-9745-4b61-b34e-14daac5e0fb3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc\" (UID: \"660ff524-9745-4b61-b34e-14daac5e0fb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:04.778758 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.778736 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660ff524-9745-4b61-b34e-14daac5e0fb3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc\" (UID: \"660ff524-9745-4b61-b34e-14daac5e0fb3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:04.864904 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.864801 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:04.994102 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:04.994075 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc"] Apr 16 20:57:04.996598 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:57:04.996572 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660ff524_9745_4b61_b34e_14daac5e0fb3.slice/crio-e6d6de676e8e9383fc0cecdc9219dc66bad50830fbc029cd5cab9880b6f3a0f5 WatchSource:0}: Error finding container e6d6de676e8e9383fc0cecdc9219dc66bad50830fbc029cd5cab9880b6f3a0f5: Status 404 returned error can't find the container with id e6d6de676e8e9383fc0cecdc9219dc66bad50830fbc029cd5cab9880b6f3a0f5 Apr 16 20:57:05.993815 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:05.993779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" event={"ID":"660ff524-9745-4b61-b34e-14daac5e0fb3","Type":"ContainerStarted","Data":"353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4"} Apr 16 20:57:05.993815 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:05.993815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" event={"ID":"660ff524-9745-4b61-b34e-14daac5e0fb3","Type":"ContainerStarted","Data":"e6d6de676e8e9383fc0cecdc9219dc66bad50830fbc029cd5cab9880b6f3a0f5"} Apr 16 20:57:07.728465 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:07.728423 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.79:8080: connect: connection refused" Apr 16 20:57:08.002388 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.002356 2570 generic.go:358] "Generic (PLEG): container finished" podID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerID="b4e6531a7203745b7807d7da65edf0ed60d3e13b269c5a8d1c254dbd087a11db" exitCode=0 Apr 16 20:57:08.002590 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.002392 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" event={"ID":"576d1d0a-215c-47d6-acf2-23e5fbc614e0","Type":"ContainerDied","Data":"b4e6531a7203745b7807d7da65edf0ed60d3e13b269c5a8d1c254dbd087a11db"} Apr 16 20:57:08.046354 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.046332 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:57:08.109106 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.109046 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/576d1d0a-215c-47d6-acf2-23e5fbc614e0-cabundle-cert\") pod \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " Apr 16 20:57:08.109258 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.109175 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/576d1d0a-215c-47d6-acf2-23e5fbc614e0-kserve-provision-location\") pod \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\" (UID: \"576d1d0a-215c-47d6-acf2-23e5fbc614e0\") " Apr 16 20:57:08.109451 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.109429 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/576d1d0a-215c-47d6-acf2-23e5fbc614e0-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "576d1d0a-215c-47d6-acf2-23e5fbc614e0" (UID: "576d1d0a-215c-47d6-acf2-23e5fbc614e0"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:57:08.109505 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.109481 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/576d1d0a-215c-47d6-acf2-23e5fbc614e0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "576d1d0a-215c-47d6-acf2-23e5fbc614e0" (UID: "576d1d0a-215c-47d6-acf2-23e5fbc614e0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:57:08.210592 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.210553 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/576d1d0a-215c-47d6-acf2-23e5fbc614e0-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:57:08.210592 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:08.210586 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/576d1d0a-215c-47d6-acf2-23e5fbc614e0-cabundle-cert\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:57:09.008176 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.008133 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" event={"ID":"576d1d0a-215c-47d6-acf2-23e5fbc614e0","Type":"ContainerDied","Data":"9771f02a2cbe37aa8cfa1876d6faeb353802d4fa2cfcd01ea57599ab33636e13"} Apr 16 20:57:09.008653 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.008193 2570 scope.go:117] "RemoveContainer" containerID="b4e6531a7203745b7807d7da65edf0ed60d3e13b269c5a8d1c254dbd087a11db" Apr 16 20:57:09.008653 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.008211 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g" Apr 16 20:57:09.016964 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.016939 2570 scope.go:117] "RemoveContainer" containerID="ee159aa4931c6299cf1ae38ccf7f333e4f5e9bc01e1c7e8c041318d32d997a84" Apr 16 20:57:09.031777 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.031750 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g"] Apr 16 20:57:09.034810 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.034787 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-8s25g"] Apr 16 20:57:09.111855 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:09.111813 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" path="/var/lib/kubelet/pods/576d1d0a-215c-47d6-acf2-23e5fbc614e0/volumes" Apr 16 20:57:10.013849 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:10.013823 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc_660ff524-9745-4b61-b34e-14daac5e0fb3/storage-initializer/0.log" Apr 16 20:57:10.014267 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:10.013860 2570 generic.go:358] "Generic (PLEG): container finished" podID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerID="353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4" exitCode=1 Apr 16 20:57:10.014267 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:10.013896 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" event={"ID":"660ff524-9745-4b61-b34e-14daac5e0fb3","Type":"ContainerDied","Data":"353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4"} Apr 16 20:57:11.019150 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:57:11.019120 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc_660ff524-9745-4b61-b34e-14daac5e0fb3/storage-initializer/0.log" Apr 16 20:57:11.019581 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:11.019171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" event={"ID":"660ff524-9745-4b61-b34e-14daac5e0fb3","Type":"ContainerStarted","Data":"be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b"} Apr 16 20:57:14.605776 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:14.605738 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc"] Apr 16 20:57:14.606227 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:14.605993 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" containerID="cri-o://be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b" gracePeriod=30 Apr 16 20:57:15.354531 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:15.354505 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc_660ff524-9745-4b61-b34e-14daac5e0fb3/storage-initializer/1.log" Apr 16 20:57:15.354868 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:15.354849 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc_660ff524-9745-4b61-b34e-14daac5e0fb3/storage-initializer/0.log" Apr 16 20:57:15.354924 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:15.354916 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:15.473521 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:15.473431 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660ff524-9745-4b61-b34e-14daac5e0fb3-kserve-provision-location\") pod \"660ff524-9745-4b61-b34e-14daac5e0fb3\" (UID: \"660ff524-9745-4b61-b34e-14daac5e0fb3\") " Apr 16 20:57:15.473710 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:15.473687 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660ff524-9745-4b61-b34e-14daac5e0fb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "660ff524-9745-4b61-b34e-14daac5e0fb3" (UID: "660ff524-9745-4b61-b34e-14daac5e0fb3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:57:15.574835 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:15.574794 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660ff524-9745-4b61-b34e-14daac5e0fb3-kserve-provision-location\") on node \"ip-10-0-129-34.ec2.internal\" DevicePath \"\"" Apr 16 20:57:16.038445 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038415 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc_660ff524-9745-4b61-b34e-14daac5e0fb3/storage-initializer/1.log" Apr 16 20:57:16.038862 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038747 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc_660ff524-9745-4b61-b34e-14daac5e0fb3/storage-initializer/0.log" Apr 16 20:57:16.038862 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038781 2570 
generic.go:358] "Generic (PLEG): container finished" podID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerID="be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b" exitCode=1 Apr 16 20:57:16.038862 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038817 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" event={"ID":"660ff524-9745-4b61-b34e-14daac5e0fb3","Type":"ContainerDied","Data":"be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b"} Apr 16 20:57:16.038862 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038851 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" Apr 16 20:57:16.039042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc" event={"ID":"660ff524-9745-4b61-b34e-14daac5e0fb3","Type":"ContainerDied","Data":"e6d6de676e8e9383fc0cecdc9219dc66bad50830fbc029cd5cab9880b6f3a0f5"} Apr 16 20:57:16.039042 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.038880 2570 scope.go:117] "RemoveContainer" containerID="be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b" Apr 16 20:57:16.047251 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.047231 2570 scope.go:117] "RemoveContainer" containerID="353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4" Apr 16 20:57:16.054693 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.054674 2570 scope.go:117] "RemoveContainer" containerID="be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b" Apr 16 20:57:16.054941 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:57:16.054924 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b\": container with ID starting with be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b not found: ID does not exist" containerID="be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b" Apr 16 20:57:16.055011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.054948 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b"} err="failed to get container status \"be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b\": rpc error: code = NotFound desc = could not find container \"be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b\": container with ID starting with be6176e94b08151d0067c5d1bf4af4a804ddb58ca129b7854591ed7bb719870b not found: ID does not exist" Apr 16 20:57:16.055011 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.054964 2570 scope.go:117] "RemoveContainer" containerID="353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4" Apr 16 20:57:16.055242 ip-10-0-129-34 kubenswrapper[2570]: E0416 20:57:16.055224 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4\": container with ID starting with 353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4 not found: ID does not exist" containerID="353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4" Apr 16 20:57:16.055309 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.055245 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4"} err="failed to get container status \"353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4\": rpc error: code = NotFound desc = could not find container 
\"353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4\": container with ID starting with 353c9ee0643b77b53a96ff25140d255a562c3ecfc86b0c35139f0473c7d85ae4 not found: ID does not exist" Apr 16 20:57:16.071694 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.071664 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc"] Apr 16 20:57:16.077251 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:16.077227 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-vblvc"] Apr 16 20:57:17.110942 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:17.110908 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" path="/var/lib/kubelet/pods/660ff524-9745-4b61-b34e-14daac5e0fb3/volumes" Apr 16 20:57:41.971621 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.971582 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x4l79/must-gather-jjmxp"] Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972134 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972153 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972169 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="storage-initializer" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972179 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" 
containerName="storage-initializer" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972213 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972223 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972310 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972322 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" Apr 16 20:57:41.972402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972336 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="576d1d0a-215c-47d6-acf2-23e5fbc614e0" containerName="kserve-container" Apr 16 20:57:41.972733 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972436 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" Apr 16 20:57:41.972733 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.972446 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="660ff524-9745-4b61-b34e-14daac5e0fb3" containerName="storage-initializer" Apr 16 20:57:41.975613 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.975590 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:41.978701 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.978663 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x4l79\"/\"kube-root-ca.crt\"" Apr 16 20:57:41.978829 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.978755 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x4l79\"/\"openshift-service-ca.crt\"" Apr 16 20:57:41.979864 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.979834 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x4l79\"/\"default-dockercfg-4rfsr\"" Apr 16 20:57:41.982381 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:41.982355 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/must-gather-jjmxp"] Apr 16 20:57:42.009288 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.009256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/397d0a60-9dc1-4e1c-aac2-3045f840d62b-must-gather-output\") pod \"must-gather-jjmxp\" (UID: \"397d0a60-9dc1-4e1c-aac2-3045f840d62b\") " pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.009448 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.009322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctsv\" (UniqueName: \"kubernetes.io/projected/397d0a60-9dc1-4e1c-aac2-3045f840d62b-kube-api-access-hctsv\") pod \"must-gather-jjmxp\" (UID: \"397d0a60-9dc1-4e1c-aac2-3045f840d62b\") " pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.110500 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.110453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hctsv\" (UniqueName: 
\"kubernetes.io/projected/397d0a60-9dc1-4e1c-aac2-3045f840d62b-kube-api-access-hctsv\") pod \"must-gather-jjmxp\" (UID: \"397d0a60-9dc1-4e1c-aac2-3045f840d62b\") " pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.110701 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.110546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/397d0a60-9dc1-4e1c-aac2-3045f840d62b-must-gather-output\") pod \"must-gather-jjmxp\" (UID: \"397d0a60-9dc1-4e1c-aac2-3045f840d62b\") " pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.110894 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.110873 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/397d0a60-9dc1-4e1c-aac2-3045f840d62b-must-gather-output\") pod \"must-gather-jjmxp\" (UID: \"397d0a60-9dc1-4e1c-aac2-3045f840d62b\") " pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.119186 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.119157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctsv\" (UniqueName: \"kubernetes.io/projected/397d0a60-9dc1-4e1c-aac2-3045f840d62b-kube-api-access-hctsv\") pod \"must-gather-jjmxp\" (UID: \"397d0a60-9dc1-4e1c-aac2-3045f840d62b\") " pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.295443 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.295401 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x4l79/must-gather-jjmxp" Apr 16 20:57:42.425555 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.422413 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/must-gather-jjmxp"] Apr 16 20:57:42.429464 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:42.429433 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:57:43.134455 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:43.134414 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/must-gather-jjmxp" event={"ID":"397d0a60-9dc1-4e1c-aac2-3045f840d62b","Type":"ContainerStarted","Data":"235e610dc72d0d2a678f1d63b72769b4911fea355b4931c203a39ede8fee8cde"} Apr 16 20:57:44.140685 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:44.140648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/must-gather-jjmxp" event={"ID":"397d0a60-9dc1-4e1c-aac2-3045f840d62b","Type":"ContainerStarted","Data":"8c5415ce6d345fd317661ea20289c1db0e1be08c0f71b573894530b4bcdbc91f"} Apr 16 20:57:44.141226 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:44.140692 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/must-gather-jjmxp" event={"ID":"397d0a60-9dc1-4e1c-aac2-3045f840d62b","Type":"ContainerStarted","Data":"4ef904bcf508862d859080b34de06338e3363d34fce5ddb055bfcbe698a49e8b"} Apr 16 20:57:44.157734 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:44.157670 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x4l79/must-gather-jjmxp" podStartSLOduration=2.441233638 podStartE2EDuration="3.157648349s" podCreationTimestamp="2026-04-16 20:57:41 +0000 UTC" firstStartedPulling="2026-04-16 20:57:42.429614444 +0000 UTC m=+3825.964480470" lastFinishedPulling="2026-04-16 20:57:43.146029167 +0000 UTC m=+3826.680895181" observedRunningTime="2026-04-16 
20:57:44.157013589 +0000 UTC m=+3827.691879631" watchObservedRunningTime="2026-04-16 20:57:44.157648349 +0000 UTC m=+3827.692514398" Apr 16 20:57:44.743762 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:44.743727 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pcrsp_8b2defe9-7ceb-4351-8846-5d2d737476d8/global-pull-secret-syncer/0.log" Apr 16 20:57:44.834495 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:44.834461 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9ktg6_0114be71-3ca1-48b4-bcff-512d02284f83/konnectivity-agent/0.log" Apr 16 20:57:44.954728 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:44.954696 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-34.ec2.internal_52f7ef5b748605fa2e3167b9e181ddfa/haproxy/0.log" Apr 16 20:57:48.295565 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.295534 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/alertmanager/0.log" Apr 16 20:57:48.316969 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.316916 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/config-reloader/0.log" Apr 16 20:57:48.336461 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.336431 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/kube-rbac-proxy-web/0.log" Apr 16 20:57:48.356521 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.356488 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/kube-rbac-proxy/0.log" Apr 16 20:57:48.376167 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.376138 2570 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/kube-rbac-proxy-metric/0.log" Apr 16 20:57:48.398819 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.398783 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/prom-label-proxy/0.log" Apr 16 20:57:48.423679 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.423566 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_58ad724e-d6fb-4e83-96a1-0cdd2d3a50b0/init-config-reloader/0.log" Apr 16 20:57:48.506719 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.506666 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wx7xc_93af7b8d-bce4-432e-885a-c48d5e8895fa/kube-state-metrics/0.log" Apr 16 20:57:48.528514 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.528482 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wx7xc_93af7b8d-bce4-432e-885a-c48d5e8895fa/kube-rbac-proxy-main/0.log" Apr 16 20:57:48.548788 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.548656 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wx7xc_93af7b8d-bce4-432e-885a-c48d5e8895fa/kube-rbac-proxy-self/0.log" Apr 16 20:57:48.590758 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.590724 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-57c6995b97-7nqjd_4099f14d-0af4-4fc9-ad94-aa5abbfc67ce/metrics-server/0.log" Apr 16 20:57:48.620317 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.620270 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-65ndw_8995f8ed-4eab-4bb2-a5a7-2df79fd493f4/monitoring-plugin/0.log" Apr 16 20:57:48.790696 ip-10-0-129-34 
kubenswrapper[2570]: I0416 20:57:48.790667 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmzrv_a2fc1956-9cb0-45a4-8b02-f764c61c9655/node-exporter/0.log"
Apr 16 20:57:48.813517 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.813431 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmzrv_a2fc1956-9cb0-45a4-8b02-f764c61c9655/kube-rbac-proxy/0.log"
Apr 16 20:57:48.832960 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.832931 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmzrv_a2fc1956-9cb0-45a4-8b02-f764c61c9655/init-textfile/0.log"
Apr 16 20:57:48.861151 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.861120 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-f47kp_5193f0a6-8fbd-4c14-b92c-1eae57fc248b/kube-rbac-proxy-main/0.log"
Apr 16 20:57:48.883697 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.883663 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-f47kp_5193f0a6-8fbd-4c14-b92c-1eae57fc248b/kube-rbac-proxy-self/0.log"
Apr 16 20:57:48.905330 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:48.905294 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-f47kp_5193f0a6-8fbd-4c14-b92c-1eae57fc248b/openshift-state-metrics/0.log"
Apr 16 20:57:49.106491 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.106398 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-cblsl_6a2f4e5e-93ed-4a68-975d-93771007f8ff/prometheus-operator/0.log"
Apr 16 20:57:49.128192 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.128159 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-cblsl_6a2f4e5e-93ed-4a68-975d-93771007f8ff/kube-rbac-proxy/0.log"
Apr 16 20:57:49.182498 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.182467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-57b9c5ddb9-tsmww_fc862099-3346-4ac3-b874-451d17baebf9/telemeter-client/0.log"
Apr 16 20:57:49.203715 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.203648 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-57b9c5ddb9-tsmww_fc862099-3346-4ac3-b874-451d17baebf9/reload/0.log"
Apr 16 20:57:49.228139 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.228109 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-57b9c5ddb9-tsmww_fc862099-3346-4ac3-b874-451d17baebf9/kube-rbac-proxy/0.log"
Apr 16 20:57:49.267334 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.267299 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6896fd559d-s8tfj_fa467cab-b8b0-4afb-9a81-a5ed7c53cd97/thanos-query/0.log"
Apr 16 20:57:49.291188 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.291153 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6896fd559d-s8tfj_fa467cab-b8b0-4afb-9a81-a5ed7c53cd97/kube-rbac-proxy-web/0.log"
Apr 16 20:57:49.313019 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.312990 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6896fd559d-s8tfj_fa467cab-b8b0-4afb-9a81-a5ed7c53cd97/kube-rbac-proxy/0.log"
Apr 16 20:57:49.335041 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.335012 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6896fd559d-s8tfj_fa467cab-b8b0-4afb-9a81-a5ed7c53cd97/prom-label-proxy/0.log"
Apr 16 20:57:49.355614 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.355585 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6896fd559d-s8tfj_fa467cab-b8b0-4afb-9a81-a5ed7c53cd97/kube-rbac-proxy-rules/0.log"
Apr 16 20:57:49.376828 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:49.376752 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6896fd559d-s8tfj_fa467cab-b8b0-4afb-9a81-a5ed7c53cd97/kube-rbac-proxy-metrics/0.log"
Apr 16 20:57:51.372370 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:51.372336 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f588d468-xszx9_ef57cf43-51ab-4af0-aa56-6cc5a51fe4c8/console/0.log"
Apr 16 20:57:52.149646 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.149605 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"]
Apr 16 20:57:52.155432 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.155399 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.161402 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.161371 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"]
Apr 16 20:57:52.205560 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.205521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-lib-modules\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.205907 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.205887 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zllt\" (UniqueName: \"kubernetes.io/projected/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-kube-api-access-7zllt\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.206020 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.206007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-proc\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.206198 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.206183 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-podres\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.206337 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.206325 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-sys\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.307793 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.307743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-podres\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.307990 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.307892 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-sys\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.307990 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.307938 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-lib-modules\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.307990 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.307966 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-podres\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.308168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.307991 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zllt\" (UniqueName: \"kubernetes.io/projected/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-kube-api-access-7zllt\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.308168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.308010 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-proc\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.308168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.308020 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-sys\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.308168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.308105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-proc\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.308168 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.308110 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-lib-modules\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.316417 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.316386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zllt\" (UniqueName: \"kubernetes.io/projected/b7a745e4-7532-4293-ac6d-3e8e0da81dcb-kube-api-access-7zllt\") pod \"perf-node-gather-daemonset-p422m\" (UID: \"b7a745e4-7532-4293-ac6d-3e8e0da81dcb\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.470292 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.470212 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:52.496838 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.496799 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nw756_133eb9cd-8eb0-4366-869a-c12b276770b4/dns/0.log"
Apr 16 20:57:52.519221 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.518726 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nw756_133eb9cd-8eb0-4366-869a-c12b276770b4/kube-rbac-proxy/0.log"
Apr 16 20:57:52.603041 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.603011 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"]
Apr 16 20:57:52.605123 ip-10-0-129-34 kubenswrapper[2570]: W0416 20:57:52.605093 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb7a745e4_7532_4293_ac6d_3e8e0da81dcb.slice/crio-5d575604fd2dbf56bcc9989b5274aa7b58185f3aeaa9e4dc1bf670d432d3535e WatchSource:0}: Error finding container 5d575604fd2dbf56bcc9989b5274aa7b58185f3aeaa9e4dc1bf670d432d3535e: Status 404 returned error can't find the container with id 5d575604fd2dbf56bcc9989b5274aa7b58185f3aeaa9e4dc1bf670d432d3535e
Apr 16 20:57:52.648708 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:52.648681 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ck5pm_5246af7c-5c30-44f4-9bad-2723e872fdf5/dns-node-resolver/0.log"
Apr 16 20:57:53.114452 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:53.114423 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2f4zv_d2fc2ca6-b29a-4acb-90f2-20b9a6a8854b/node-ca/0.log"
Apr 16 20:57:53.190516 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:53.190473 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m" event={"ID":"b7a745e4-7532-4293-ac6d-3e8e0da81dcb","Type":"ContainerStarted","Data":"3e677de9093739b0225d7853bad0f1b75906abcb49203dd900527e77b2e4955a"}
Apr 16 20:57:53.190516 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:53.190522 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m" event={"ID":"b7a745e4-7532-4293-ac6d-3e8e0da81dcb","Type":"ContainerStarted","Data":"5d575604fd2dbf56bcc9989b5274aa7b58185f3aeaa9e4dc1bf670d432d3535e"}
Apr 16 20:57:53.191325 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:53.191292 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:57:53.213972 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:53.213895 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m" podStartSLOduration=1.213873193 podStartE2EDuration="1.213873193s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:57:53.209979712 +0000 UTC m=+3836.744845749" watchObservedRunningTime="2026-04-16 20:57:53.213873193 +0000 UTC m=+3836.748739233"
Apr 16 20:57:54.303265 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:54.303230 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kql9s_1d5abd6b-52ed-4fb7-997a-abcbe592b7af/serve-healthcheck-canary/0.log"
Apr 16 20:57:54.700966 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:54.700889 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2drhq_897e1deb-27fc-453c-ab4d-8c48687f74c4/kube-rbac-proxy/0.log"
Apr 16 20:57:54.723283 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:54.723253 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2drhq_897e1deb-27fc-453c-ab4d-8c48687f74c4/exporter/0.log"
Apr 16 20:57:54.762233 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:54.762192 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2drhq_897e1deb-27fc-453c-ab4d-8c48687f74c4/extractor/0.log"
Apr 16 20:57:56.989016 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:56.988985 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-xt5m8_8589cf1b-4575-44a0-92fb-65a3021168dd/manager/0.log"
Apr 16 20:57:57.009694 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:57.009667 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-42lzj_ac6434a9-5e23-42f8-920a-c035adf4f4c2/server/0.log"
Apr 16 20:57:57.254198 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:57.254163 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-d6hxk_cea8e105-f656-446f-9fe4-949adf51ce2a/manager/0.log"
Apr 16 20:57:57.383788 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:57:57.383756 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-bxf6z_76018fe1-a7f6-440e-b2bd-f43e53652d09/seaweedfs-tls-custom/0.log"
Apr 16 20:58:00.210406 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:00.210375 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-p422m"
Apr 16 20:58:02.615802 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.615764 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/kube-multus-additional-cni-plugins/0.log"
Apr 16 20:58:02.638046 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.637965 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/egress-router-binary-copy/0.log"
Apr 16 20:58:02.659212 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.659185 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/cni-plugins/0.log"
Apr 16 20:58:02.682242 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.682214 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/bond-cni-plugin/0.log"
Apr 16 20:58:02.705365 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.705333 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/routeoverride-cni/0.log"
Apr 16 20:58:02.726823 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.726749 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/whereabouts-cni-bincopy/0.log"
Apr 16 20:58:02.747334 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.747303 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljvqz_405a22ed-e497-47a7-95e5-0362e26a6e43/whereabouts-cni/0.log"
Apr 16 20:58:02.956687 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:02.956651 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-njxv9_1ceab864-ede8-473b-8607-10b5f8b271d4/kube-multus/0.log"
Apr 16 20:58:03.043684 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:03.043643 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8x8wb_4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1/network-metrics-daemon/0.log"
Apr 16 20:58:03.061781 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:03.061748 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8x8wb_4a9e66bb-af9f-4bcc-9d1f-a9b851f41ba1/kube-rbac-proxy/0.log"
Apr 16 20:58:04.659509 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.659468 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/ovn-controller/0.log"
Apr 16 20:58:04.693757 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.693725 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/ovn-acl-logging/0.log"
Apr 16 20:58:04.710531 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.710495 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/kube-rbac-proxy-node/0.log"
Apr 16 20:58:04.729993 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.729964 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 20:58:04.754316 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.754280 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/northd/0.log"
Apr 16 20:58:04.777322 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.777292 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/nbdb/0.log"
Apr 16 20:58:04.799494 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.799464 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/sbdb/0.log"
Apr 16 20:58:04.923640 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:04.923551 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5tmb_1a71a363-02b5-43c8-ac58-44ba0eb22832/ovnkube-controller/0.log"
Apr 16 20:58:06.007382 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:06.007349 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rt6tz_e33890cf-1250-4378-b5a9-3ef264f23dad/network-check-target-container/0.log"
Apr 16 20:58:06.999964 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:06.999929 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-ktbn6_d600b8d5-af76-4864-85f2-894bc334d737/iptables-alerter/0.log"
Apr 16 20:58:07.714508 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:07.714477 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-b4vm2_3d50365e-fb34-48f7-a1c1-833ed9e44ff1/tuned/0.log"
Apr 16 20:58:11.197813 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:11.197789 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gwhqd_5ad55dd7-a35c-4704-8ecf-446e4cc0c66f/csi-driver/0.log"
Apr 16 20:58:11.224907 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:11.224873 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gwhqd_5ad55dd7-a35c-4704-8ecf-446e4cc0c66f/csi-node-driver-registrar/0.log"
Apr 16 20:58:11.244815 ip-10-0-129-34 kubenswrapper[2570]: I0416 20:58:11.244785 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gwhqd_5ad55dd7-a35c-4704-8ecf-446e4cc0c66f/csi-liveness-probe/0.log"